Nov 28 13:19:48 crc systemd[1]: Starting Kubernetes Kubelet... Nov 28 13:19:48 crc restorecon[4696]: Relabeled /var/lib/kubelet/config.json from system_u:object_r:unlabeled_t:s0 to system_u:object_r:container_var_lib_t:s0 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/device-plugins not reset as customized by admin to system_u:object_r:container_file_t:s0 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/device-plugins/kubelet.sock not reset as customized by admin to system_u:object_r:container_file_t:s0 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/volumes/kubernetes.io~configmap/nginx-conf/..2025_02_23_05_40_35.4114275528/nginx.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/22e96971 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/21c98286 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/0f1869e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/46889d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/5b6a5969 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/6c7921f5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4804f443 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/2a46b283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/a6b5573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4f88ee5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/5a4eee4b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963 Nov 28 13:19:48 
crc restorecon[4696]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/cd87c521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/38602af4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/1483b002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/0346718b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/d3ed4ada not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/3bb473a5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/8cd075a9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/00ab4760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/54a21c09 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726 Nov 28 13:19:48 crc restorecon[4696]: 
/var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/70478888 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/43802770 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/955a0edc not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/bca2d009 not reset as customized by admin to system_u:object_r:container_file_t:s0:c140,c1009 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/b295f9bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/allowlist.conf not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c574,c582 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/bc46ea27 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5731fc1b not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5e1b2a3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/943f0936 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/3f764ee4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/8695e3f9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/aed7aa86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/c64d7448 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/0ba16bd2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/207a939f not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/54aa8cdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/1f5fa595 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/bf9c8153 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/47fba4ea not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/7ae55ce9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Nov 28 
13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7906a268 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/ce43fa69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7fc7ea3a not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/d8c38b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/9ef015fb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/b9db6a41 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/b1733d79 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/afccd338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/9df0a185 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/18938cf8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/7ab4eb23 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/56930be6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_35.630010865 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c440,c975 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/0d8e3722 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/d22b2e76 not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/e036759f not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/2734c483 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/57878fe7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/3f3c2e58 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/375bec3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/7bc41e08 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Nov 28 13:19:48 crc restorecon[4696]: 
/var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/48c7a72d not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/4b66701f not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/a5a1c202 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_40.1388695756 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/26f3df5b not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c4,c22 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/6d8fb21d not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/50e94777 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208473b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/ec9e08ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3b787c39 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208eaed5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/93aa3a2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3c697968 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/ba950ec9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/cb5cdb37 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/f2df9827 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Nov 28 13:19:48 crc 
restorecon[4696]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/fedaa673 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/9ca2df95 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/b2d7460e not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2207853c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/241c1c29 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2d910eaf not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Nov 28 13:19:48 crc restorecon[4696]: 
/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/c6c0f2e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871 
Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/399edc97 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8049f7cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/0cec5484 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/312446d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c406,c828 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8e56a35d not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/2d30ddb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/eca8053d not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/c3a25c9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c168,c522 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/b9609c22 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/etc-hosts not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c968,c969 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/e8b0eca9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/b36a9c3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/38af7b07 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/ae821620 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/baa23338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/2c534809 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/59b29eae not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/c91a8e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/4d87494a not reset as customized by admin to system_u:object_r:container_file_t:s0:c442,c857 Nov 28 13:19:48 crc restorecon[4696]: 
/var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/1e33ca63 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/8dea7be2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d0b04a99 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d84f01e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/4109059b not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/a7258a3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/05bdf2b6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/f3261b51 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/315d045e not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/5fdcf278 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/d053f757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/c2850dc7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 
28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fcfb0b2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c7ac9b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fa0c0d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c609b6ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/2be6c296 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/89a32653 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/4eb9afeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/13af6efa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/b03f9724 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/e3d105cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/3aed4d83 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Nov 28 13:19:48 
crc restorecon[4696]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/0765fa6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/2cefc627 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/3dcc6345 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/365af391 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Nov 28 13:19:48 crc restorecon[4696]: 
/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b1130c0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/236a5913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b9432e26 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/5ddb0e3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/986dc4fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/8a23ff9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/9728ae68 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/665f31d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Nov 28 13:19:48 crc restorecon[4696]: 
/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/136c9b42 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/98a1575b not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Nov 28 13:19:48 
crc restorecon[4696]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/cac69136 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/5deb77a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/2ae53400 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/e46f2326 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/dc688d3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/3497c3cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/177eb008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Nov 28 13:19:48 crc restorecon[4696]: 
/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/af5a2afa not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/d780cb1f not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/49b0f374 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/26fbb125 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/cf14125a not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/b7f86972 not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c5,c11 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/e51d739c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/88ba6a69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/669a9acf not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/5cd51231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/75349ec7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/15c26839 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/45023dcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/2bb66a50 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/64d03bdd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/ab8e7ca0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/bb9be25f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/9a0b61d3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/d471b9d2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/8cb76b8e not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/11a00840 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/ec355a92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/992f735e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d59cdbbc not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Nov 28 13:19:48 crc restorecon[4696]: 
/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/72133ff0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/c56c834c not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d13724c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/0a498258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa471982 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fc900d92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa7d68da not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/4bacf9b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/424021b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/fc2e31a3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/f51eefac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/c8997f2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/7481f599 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Nov 28 13:19:48 crc restorecon[4696]: 
/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/fdafea19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/d0e1c571 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/ee398915 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/682bb6b8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a3e67855 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a989f289 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/915431bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/7796fdab not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/dcdb5f19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/a3aaa88c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Nov 28 13:19:48 crc restorecon[4696]: 
/var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/5508e3e6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/160585de not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/e99f8da3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/8bc85570 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/a5861c91 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/84db1135 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/9e1a6043 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/c1aba1c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/d55ccd6d not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/971cc9f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/8f2e3dcf not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/ceb35e9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/1c192745 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/5209e501 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/f83de4df not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/e7b978ac not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/c64304a1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Nov 28 13:19:48 crc restorecon[4696]: 
/var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/5384386b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/cce3e3ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/8fb75465 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/740f573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/32fd1134 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/0a861bd3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/80363026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/bfa952a8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c129,c158 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..2025_02_23_05_33_31.333075221 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/793bf43d not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/7db1bb6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/4f6a0368 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/c12c7d86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/36c4a773 not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/4c1e98ae not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/a4c8115c not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/setup/7db1802e not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver/a008a7ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-syncer/2c836bac not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-regeneration-controller/0ce62299 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-insecure-readyz/945d2457 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c97,c980 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-check-endpoints/7d5c1dd8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/index.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/bundle-v1.15.0.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/channel.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/package.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/bc8d0691 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/6b76097a not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/34d1af30 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/312ba61c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/645d5dd1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/16e825f0 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/4cf51fc9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/2a23d348 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/075dbd49 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/dd585ddd not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c377,c642 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/17ebd0ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c343 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/005579f4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_23_11.1287037894 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Nov 28 13:19:48 crc restorecon[4696]: 
/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/bf5f3b9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/af276eb7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/ea28e322 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/692e6683 not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/871746a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/4eb2e958 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Nov 28 13:19:48 crc restorecon[4696]: 
/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/ca9b62da not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c0,c25 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/0edd6fce not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/containers/controller-manager/89b4555f not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Nov 28 13:19:48 crc restorecon[4696]: 
/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/655fcd71 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/0d43c002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/e68efd17 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/9acf9b65 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/5ae3ff11 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/1e59206a not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/27af16d1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c304,c1017 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/7918e729 not reset as customized by admin to system_u:object_r:container_file_t:s0:c853,c893 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/5d976d0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c585,c981 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Nov 28 13:19:48 crc restorecon[4696]: 
/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/d7f55cbb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/f0812073 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/1a56cbeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/7fdd437e not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/cdfb5652 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Nov 28 13:19:48 crc restorecon[4696]: 
/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Nov 28 13:19:48 crc restorecon[4696]: 
/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/fix-audit-permissions/fb93119e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver/f1e8fc0e not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c336,c787 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver-check-endpoints/218511f3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server/serving-certs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/ca8af7b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/72cc8a75 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/6e8a3760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4c3455c0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Nov 28 13:19:48 crc restorecon[4696]: 
/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/2278acb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4b453e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/3ec09bda not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2/cacerts.bin not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java/cacerts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl/ca-bundle.trust.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/email-ca-bundle.pem not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/objsign-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2ae6433e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fde84897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75680d2e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/openshift-service-serving-signer_1740288168.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/facfc4fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f5a969c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CFCA_EV_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9ef4a08a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ingress-operator_1740288202.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2f332aed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/248c8271.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc restorecon[4696]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d10a21f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ACCVRAIZ1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a94d09e5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c9a4d3b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40193066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd8c0d63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b936d1c6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CA_Disig_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4fd49c6c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM_SERVIDORES_SEGUROS.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b81b93f0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f9a69fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc restorecon[4696]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b30d5fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ANF_Secure_Server_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b433981b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93851c9e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9282e51c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7dd1bc4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Actalis_Authentication_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/930ac5d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f47b495.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e113c810.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5931b5bc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Commercial.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc restorecon[4696]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2b349938.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e48193cf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/302904dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a716d4ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Networking.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93bc0acc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/86212b19.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b727005e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbc54cab.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f51bb24c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c28a8a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc restorecon[4696]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9c8dfbd4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ccc52f49.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cb1c3204.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ce5e74ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd08c599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6d41d539.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb5fa911.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e35234b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc restorecon[4696]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8cb5ee0f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a7c655d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f8fc53da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/de6d66f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d41b5e2a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/41a3f684.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1df5a75f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_2011.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e36a6752.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b872f2b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9576d26b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/228f89db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc restorecon[4696]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_ECC_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb717492.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d21b73c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b1b94ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/595e996b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_RSA_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b46e03d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/128f4b91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_3_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81f2d2b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Autoridad_de_Certificacion_Firmaprofesional_CIF_A62634068.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3bde41ac.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d16a5865.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc restorecon[4696]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_EC-384_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0179095f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ffa7f1eb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9482e63a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4dae3dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e359ba6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7e067d03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/95aff9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7746a63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Baltimore_CyberTrust_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/653b494a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc restorecon[4696]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3ad48a91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_2_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/54657681.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/82223c44.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8de2f56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d9dafe4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d96b65e2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee64a828.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40547a79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5a3f0ff8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a780d93.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc restorecon[4696]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/34d996fb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/eed8c118.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/89c02a45.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b1159c4c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d6325660.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4c339cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8312c4c1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_E1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8508e720.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5fdd185d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc restorecon[4696]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48bec511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/69105f4f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b9bc432.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/32888f65.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b03dec0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/219d9499.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5acf816d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbf06781.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc restorecon[4696]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc99f41e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AAA_Certificate_Services.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/985c1f52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8794b4e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_BR_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7c037b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ef954a4e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_EV_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2add47b6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc restorecon[4696]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/90c5a3c8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0f3e76e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/53a1b57a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_EV_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5ad8a5d6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/68dd7389.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d04f354.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d6437c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/062cdee6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc restorecon[4696]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bd43e1dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7f3d5d1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c491639e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3513523f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/399e7759.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/feffd413.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d18e9066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/607986c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c90bc37d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc restorecon[4696]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1b0f7e5c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e08bfd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dd8e9d41.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed39abd0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a3418fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bc3f2570.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_High_Assurance_EV_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/244b5494.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81b9768f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4be590e0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_ECC_P384_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc restorecon[4696]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9846683b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/252252d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e8e7201.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_RSA4096_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d52c538d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c44cc0c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Trusted_Root_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75d1b2ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a2c66da8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ecccd8db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc restorecon[4696]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust.net_Certification_Authority__2048_.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/aee5f10d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e7271e8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0e59380.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4c3982f2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b99d060.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf64f35b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0a775a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/002c0b4f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cc450945.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_EC1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/106f3e4d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc restorecon[4696]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b3fb433b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4042bcee.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/02265526.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/455f1b52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0d69c7e1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9f727ac7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5e98733a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0cd152c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc4d6a89.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6187b673.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc restorecon[4696]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/FIRMAPROFESIONAL_CA_ROOT-A_WEB.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ba8887ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/068570d1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f081611a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48a195d8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GDCA_TrustAUTH_R5_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f6fa695.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab59055e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b92fd57f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GLOBALTRUST_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fa5da96b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ec40989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7719f463.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc restorecon[4696]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1001acf7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f013ecaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/626dceaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c559d742.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1d3472b9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9479c8c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a81e292b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4bfab552.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e071171e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc restorecon[4696]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/57bcb2da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_ECC_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab5346f4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5046c355.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_RSA_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/865fbdf9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da0cfd1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/85cde254.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_ECC_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbb3f32b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureSign_RootCA11.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5860aaa6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc restorecon[4696]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/31188b5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HiPKI_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c7f1359b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f15c80c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hongkong_Post_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/09789157.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/18856ac4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e09d511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Commercial_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cf701eeb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d06393bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Public_Sector_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc restorecon[4696]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/10531352.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Izenpe.com.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureTrust_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0ed035a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsec_e-Szigno_Root_CA_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8160b96c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8651083.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2c63f966.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_ECC_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d89cda1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/01419da9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_RSA_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc restorecon[4696]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7a5b843.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_RSA_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf53fb88.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9591a472.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3afde786.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Gold_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NAVER_Global_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3fb36b73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d39b0a2c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a89d74c2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd58d51e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7db1890.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NetLock_Arany__Class_Gold__F__tan__s__tv__ny.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc restorecon[4696]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/988a38cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/60afe812.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f39fc864.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5443e9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GB_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e73d606e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dfc0fe80.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b66938e9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e1eab7c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GC_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/773e07ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c899c73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d59297b8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc restorecon[4696]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ddcda989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_1_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/749e9e03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/52b525c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7e8dc79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a819ef2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/08063a00.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b483515.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/064e0aa9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1f58a078.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc restorecon[4696]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6f7454b3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7fa05551.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76faf6c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9339512a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f387163d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee37c333.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e18bfb83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e442e424.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fe8a2cd8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/23f4c490.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5cd81ad7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc restorecon[4696]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0c70a8d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7892ad52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SZAFIR_ROOT_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4f316efb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_RSA_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/06dc52d5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/583d0756.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0bf05006.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/88950faa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9046744a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc restorecon[4696]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c860d51.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_RSA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6fa5da56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/33ee480d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Secure_Global_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/63a2c897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_ECC_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bdacca6f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ff34af3f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbff3a01.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_ECC_RootCA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_C1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc restorecon[4696]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/406c9bb1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_C3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Services_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Silver_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/99e1b953.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/14bc7599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TUBITAK_Kamu_SM_SSL_Kok_Sertifikasi_-_Surum_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a3adc42.0 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f459871d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_ECC_Root_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_RSA_Root_2023.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TeliaSonera_Root_CA_v1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telia_Root_CA_v2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f103249.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f058632f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-certificates.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9bf03295.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/98aaf404.0 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1cef98f5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/073bfcc5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2923b3f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f249de83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/edcbddb5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P256_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b5697b0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ae85e5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b74d2bd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 
13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P384_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d887a5bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9aef356c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TunTrust_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd64f3fc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e13665f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Extended_Validation_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f5dc4f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da7377f6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Global_G2_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c01eb047.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/304d27c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed858448.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc restorecon[4696]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f30dd6ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/04f60c28.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_ECC_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fc5a8f99.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/35105088.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee532fd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/XRamp_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/706f604c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76579174.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d86cdd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc restorecon[4696]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/882de061.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f618aec.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a9d40e02.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e-Szigno_Root_CA_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e868b802.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/83e9984f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ePKI_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca6e4ad9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d6523ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4b718d9b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/869fbf79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/containers/registry/f8d22bdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 28 13:19:48 crc 
restorecon[4696]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/6e8bbfac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/54dd7996 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/a4f1bb05 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/207129da not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/c1df39e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/15b8f1cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Nov 28 13:19:48 crc restorecon[4696]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/77bd6913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/2382c1b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/704ce128 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/70d16fe0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/bfb95535 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/57a8e8e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/1b9d3e5e not reset as customized by admin to system_u:object_r:container_file_t:s0:c107,c917 Nov 28 13:19:48 crc restorecon[4696]: 
/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/fddb173c not reset as customized by admin to system_u:object_r:container_file_t:s0:c202,c983 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/95d3c6c4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/bfb5fff5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/2aef40aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/c0391cad not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/1119e69d not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/660608b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/8220bd53 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/85f99d5c not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/4b0225f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/9c2a3394 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/e820b243 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/1ca52ea0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/e6988e45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Nov 28 13:19:48 crc restorecon[4696]: 
/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/6655f00b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/98bc3986 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/08e3458a not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/2a191cb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/6c4eeefb not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/f61a549c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/24891863 not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/fbdfd89c not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/9b63b3bc not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c37,c572 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/8acde6d6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/node-driver-registrar/59ecbba3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/csi-provisioner/685d4be3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Nov 28 13:19:48 crc restorecon[4696]: 
/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/containers/route-controller-manager/feaea55e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 
13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg/catalog.json not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/63709497 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/d966b7fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/f5773757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/81c9edb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/57bf57ee not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/86f5e6aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/0aabe31d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/d2af85c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/09d157d9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller not reset 
as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 
13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:48 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:49 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:49 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:49 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:49 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:49 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:49 crc restorecon[4696]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:49 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:49 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:49 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:49 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:49 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:49 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:49 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:49 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:49 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:49 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:49 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:49 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:49 crc restorecon[4696]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:49 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:49 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:49 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:49 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:49 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:49 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:49 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:49 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:49 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:49 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:49 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:49 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:49 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:49 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:49 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:49 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:49 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:49 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:49 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:49 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:49 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:49 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:49 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:49 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:49 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:49 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:49 crc restorecon[4696]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:49 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:49 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:49 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:49 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:49 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:49 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:49 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:49 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:49 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:49 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:49 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:49 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:49 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:49 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:49 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:49 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:49 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:49 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:49 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:49 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:49 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:49 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:49 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:49 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:49 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:49 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:49 crc restorecon[4696]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:49 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:49 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:49 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:49 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:49 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:49 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:49 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:49 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:49 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:49 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:49 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:49 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:49 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus/catalog.json not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:49 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:49 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:49 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:49 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:49 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:49 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:49 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:49 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:49 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:49 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:49 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:49 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:49 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:49 crc restorecon[4696]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:49 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:49 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:49 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:49 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:49 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:49 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:49 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:49 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:49 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:49 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:49 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:49 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:49 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:49 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:49 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:49 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:49 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:49 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:49 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:49 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:49 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:49 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:49 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:49 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:49 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:49 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:49 crc restorecon[4696]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:49 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:49 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:49 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:49 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:49 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:49 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:49 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:49 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:49 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:49 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:49 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:49 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:49 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:49 crc 
restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:49 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:49 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:49 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:49 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:49 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:49 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:49 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:49 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:49 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:49 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:49 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:49 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:49 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:49 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:49 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:49 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:49 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:49 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:49 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:49 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:49 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:49 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:49 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:49 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:49 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:49 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:49 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:49 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:49 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:49 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:49 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:49 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:49 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:49 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:49 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:49 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:49 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:49 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:49 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:49 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:49 crc restorecon[4696]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:49 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:49 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:49 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:49 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:49 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:49 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:49 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:49 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:49 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:49 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c0fe7256 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:49 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c30319e4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:49 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/e6b1dd45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:49 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/2bb643f0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:49 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/920de426 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:49 crc restorecon[4696]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/70fa1e87 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:49 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/a1c12a2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:49 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/9442e6c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:49 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/5b45ec72 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:49 crc restorecon[4696]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:49 crc restorecon[4696]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:49 crc restorecon[4696]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:49 crc restorecon[4696]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:49 crc restorecon[4696]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:49 crc restorecon[4696]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:49 crc restorecon[4696]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:49 crc restorecon[4696]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:49 crc restorecon[4696]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:49 crc restorecon[4696]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:49 crc restorecon[4696]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:49 crc restorecon[4696]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:49 crc restorecon[4696]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:49 crc restorecon[4696]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:49 crc restorecon[4696]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:49 crc restorecon[4696]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:49 crc restorecon[4696]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:49 crc restorecon[4696]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:49 crc restorecon[4696]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:49 crc restorecon[4696]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:49 crc restorecon[4696]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:49 crc restorecon[4696]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:49 crc restorecon[4696]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:49 crc restorecon[4696]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:49 crc restorecon[4696]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:49 crc restorecon[4696]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:49 crc restorecon[4696]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:49 crc restorecon[4696]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:49 crc restorecon[4696]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:49 crc restorecon[4696]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:49 crc restorecon[4696]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:49 crc restorecon[4696]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:49 crc restorecon[4696]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:49 crc restorecon[4696]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:49 crc restorecon[4696]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:49 crc restorecon[4696]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:49 crc restorecon[4696]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:49 crc restorecon[4696]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:49 crc restorecon[4696]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:49 crc restorecon[4696]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:49 crc restorecon[4696]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:49 crc restorecon[4696]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:49 crc restorecon[4696]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:49 crc restorecon[4696]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:49 crc restorecon[4696]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:49 crc restorecon[4696]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:49 crc restorecon[4696]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:49 crc restorecon[4696]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:49 crc restorecon[4696]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:49 crc restorecon[4696]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:49 crc restorecon[4696]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:49 crc restorecon[4696]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:49 crc restorecon[4696]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:49 crc restorecon[4696]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:49 crc restorecon[4696]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:49 crc restorecon[4696]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:49 crc restorecon[4696]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:49 crc restorecon[4696]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:49 crc restorecon[4696]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:49 crc restorecon[4696]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:49 crc restorecon[4696]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:49 crc restorecon[4696]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:49 crc restorecon[4696]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:49 crc restorecon[4696]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:49 crc restorecon[4696]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:49 crc restorecon[4696]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:49 crc restorecon[4696]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:49 crc restorecon[4696]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:49 crc restorecon[4696]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:49 crc restorecon[4696]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:49 crc restorecon[4696]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:49 crc restorecon[4696]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:49 crc restorecon[4696]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:49 crc restorecon[4696]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:49 crc restorecon[4696]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:49 crc restorecon[4696]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:49 crc restorecon[4696]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:49 crc restorecon[4696]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:49 crc restorecon[4696]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:49 crc restorecon[4696]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:49 crc restorecon[4696]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:49 crc restorecon[4696]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:49 crc restorecon[4696]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:49 crc restorecon[4696]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:49 crc restorecon[4696]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:49 crc restorecon[4696]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:49 crc restorecon[4696]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:49 crc restorecon[4696]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:49 crc restorecon[4696]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:49 crc restorecon[4696]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:49 crc restorecon[4696]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:49 crc restorecon[4696]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/3c9f3a59 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:49 crc restorecon[4696]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/1091c11b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:49 crc restorecon[4696]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/9a6821c6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:49 crc restorecon[4696]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/ec0c35e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:49 crc restorecon[4696]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/517f37e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:49 crc restorecon[4696]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/6214fe78 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:49 crc restorecon[4696]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/ba189c8b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:49 crc restorecon[4696]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/351e4f31 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:49 crc restorecon[4696]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/c0f219ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 28 13:19:49 crc restorecon[4696]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Nov 28 13:19:49 crc restorecon[4696]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/8069f607 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Nov 28 13:19:49 crc restorecon[4696]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/559c3d82 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Nov 28 13:19:49 crc restorecon[4696]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/605ad488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Nov 28 13:19:49 crc restorecon[4696]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/148df488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Nov 28 13:19:49 crc restorecon[4696]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/3bf6dcb4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c133,c223 Nov 28 13:19:49 crc restorecon[4696]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/022a2feb not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Nov 28 13:19:49 crc restorecon[4696]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/938c3924 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Nov 28 13:19:49 crc restorecon[4696]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/729fe23e not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Nov 28 13:19:49 crc restorecon[4696]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/1fd5cbd4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Nov 28 13:19:49 crc restorecon[4696]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/a96697e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Nov 28 13:19:49 crc restorecon[4696]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/e155ddca not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Nov 28 13:19:49 crc restorecon[4696]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/10dd0e0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Nov 28 13:19:49 crc restorecon[4696]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Nov 28 13:19:49 crc restorecon[4696]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Nov 28 13:19:49 crc restorecon[4696]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Nov 28 13:19:49 crc restorecon[4696]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Nov 28 13:19:49 crc restorecon[4696]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Nov 28 13:19:49 crc restorecon[4696]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Nov 28 13:19:49 crc restorecon[4696]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Nov 28 13:19:49 crc restorecon[4696]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Nov 28 13:19:49 crc restorecon[4696]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Nov 28 13:19:49 crc restorecon[4696]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Nov 28 13:19:49 crc restorecon[4696]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Nov 28 13:19:49 crc restorecon[4696]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Nov 28 13:19:49 crc restorecon[4696]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Nov 28 13:19:49 crc restorecon[4696]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Nov 28 13:19:49 crc restorecon[4696]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Nov 28 13:19:49 crc restorecon[4696]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Nov 28 13:19:49 crc restorecon[4696]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Nov 28 13:19:49 crc restorecon[4696]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Nov 28 13:19:49 crc restorecon[4696]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Nov 28 13:19:49 crc restorecon[4696]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Nov 28 13:19:49 crc restorecon[4696]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/etc-hosts not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c682,c947 Nov 28 13:19:49 crc restorecon[4696]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/6f2c8392 not reset as customized by admin to system_u:object_r:container_file_t:s0:c267,c588 Nov 28 13:19:49 crc restorecon[4696]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/bd241ad9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Nov 28 13:19:49 crc restorecon[4696]: /var/lib/kubelet/plugins not reset as customized by admin to system_u:object_r:container_file_t:s0 Nov 28 13:19:49 crc restorecon[4696]: /var/lib/kubelet/plugins/csi-hostpath not reset as customized by admin to system_u:object_r:container_file_t:s0 Nov 28 13:19:49 crc restorecon[4696]: /var/lib/kubelet/plugins/csi-hostpath/csi.sock not reset as customized by admin to system_u:object_r:container_file_t:s0 Nov 28 13:19:49 crc restorecon[4696]: /var/lib/kubelet/plugins/kubernetes.io not reset as customized by admin to system_u:object_r:container_file_t:s0 Nov 28 13:19:49 crc restorecon[4696]: /var/lib/kubelet/plugins/kubernetes.io/csi not reset as customized by admin to system_u:object_r:container_file_t:s0 Nov 28 13:19:49 crc restorecon[4696]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0 Nov 28 13:19:49 crc restorecon[4696]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983 not reset as customized by admin to system_u:object_r:container_file_t:s0 Nov 28 13:19:49 crc restorecon[4696]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount not reset as customized by admin to system_u:object_r:container_file_t:s0 Nov 28 13:19:49 crc restorecon[4696]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/vol_data.json not reset as customized by admin to system_u:object_r:container_file_t:s0 Nov 28 13:19:49 crc restorecon[4696]: /var/lib/kubelet/plugins_registry not reset as customized by admin to system_u:object_r:container_file_t:s0 Nov 28 13:19:49 crc restorecon[4696]: Relabeled /var/usrlocal/bin/kubenswrapper from system_u:object_r:bin_t:s0 to system_u:object_r:kubelet_exec_t:s0 Nov 28 13:19:49 crc kubenswrapper[4970]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Nov 28 13:19:49 crc kubenswrapper[4970]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version. Nov 28 13:19:49 crc kubenswrapper[4970]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Nov 28 13:19:49 crc kubenswrapper[4970]: Flag --register-with-taints has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. 
Nov 28 13:19:49 crc kubenswrapper[4970]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI. Nov 28 13:19:49 crc kubenswrapper[4970]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.252616 4970 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Nov 28 13:19:49 crc kubenswrapper[4970]: W1128 13:19:49.255184 4970 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Nov 28 13:19:49 crc kubenswrapper[4970]: W1128 13:19:49.255200 4970 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Nov 28 13:19:49 crc kubenswrapper[4970]: W1128 13:19:49.255204 4970 feature_gate.go:330] unrecognized feature gate: PlatformOperators Nov 28 13:19:49 crc kubenswrapper[4970]: W1128 13:19:49.255208 4970 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Nov 28 13:19:49 crc kubenswrapper[4970]: W1128 13:19:49.255228 4970 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Nov 28 13:19:49 crc kubenswrapper[4970]: W1128 13:19:49.255237 4970 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Nov 28 13:19:49 crc kubenswrapper[4970]: W1128 13:19:49.255242 4970 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Nov 28 13:19:49 crc kubenswrapper[4970]: W1128 13:19:49.255247 4970 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Nov 28 13:19:49 crc kubenswrapper[4970]: W1128 13:19:49.255251 4970 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Nov 28 13:19:49 crc kubenswrapper[4970]: W1128 13:19:49.255256 4970 feature_gate.go:330] unrecognized feature gate: GatewayAPI Nov 28 13:19:49 crc kubenswrapper[4970]: W1128 13:19:49.255259 4970 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Nov 28 13:19:49 crc kubenswrapper[4970]: W1128 13:19:49.255264 4970 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Nov 28 13:19:49 crc kubenswrapper[4970]: W1128 13:19:49.255267 4970 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Nov 28 13:19:49 crc kubenswrapper[4970]: W1128 13:19:49.255270 4970 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Nov 28 13:19:49 crc kubenswrapper[4970]: W1128 13:19:49.255274 4970 feature_gate.go:330] unrecognized feature gate: Example Nov 28 13:19:49 crc kubenswrapper[4970]: W1128 13:19:49.255278 4970 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Nov 28 13:19:49 crc kubenswrapper[4970]: W1128 13:19:49.255282 4970 feature_gate.go:330] unrecognized feature gate: InsightsConfig Nov 28 13:19:49 crc kubenswrapper[4970]: W1128 13:19:49.255287 4970 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Nov 28 13:19:49 crc kubenswrapper[4970]: W1128 13:19:49.255291 4970 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Nov 28 13:19:49 crc kubenswrapper[4970]: W1128 13:19:49.255295 4970 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Nov 28 13:19:49 crc kubenswrapper[4970]: W1128 13:19:49.255298 4970 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Nov 28 13:19:49 crc kubenswrapper[4970]: W1128 13:19:49.255302 4970 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Nov 28 13:19:49 crc kubenswrapper[4970]: W1128 13:19:49.255305 4970 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Nov 28 13:19:49 crc kubenswrapper[4970]: W1128 13:19:49.255309 4970 feature_gate.go:330] unrecognized feature gate: NewOLM Nov 28 13:19:49 crc kubenswrapper[4970]: W1128 13:19:49.255312 4970 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Nov 28 13:19:49 crc kubenswrapper[4970]: W1128 13:19:49.255316 4970 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Nov 28 13:19:49 crc kubenswrapper[4970]: W1128 13:19:49.255319 4970 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Nov 28 13:19:49 crc kubenswrapper[4970]: W1128 13:19:49.255324 4970 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Nov 28 13:19:49 crc kubenswrapper[4970]: W1128 13:19:49.255328 4970 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Nov 28 13:19:49 crc kubenswrapper[4970]: W1128 13:19:49.255331 4970 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Nov 28 13:19:49 crc kubenswrapper[4970]: W1128 13:19:49.255335 4970 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Nov 28 13:19:49 crc kubenswrapper[4970]: W1128 13:19:49.255339 4970 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Nov 28 13:19:49 crc kubenswrapper[4970]: W1128 13:19:49.255343 4970 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Nov 28 13:19:49 crc kubenswrapper[4970]: W1128 13:19:49.255347 4970 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Nov 28 13:19:49 crc kubenswrapper[4970]: W1128 13:19:49.255351 4970 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Nov 28 13:19:49 crc kubenswrapper[4970]: W1128 13:19:49.255355 4970 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Nov 28 13:19:49 crc kubenswrapper[4970]: W1128 13:19:49.255360 4970 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Nov 28 13:19:49 crc kubenswrapper[4970]: W1128 13:19:49.255364 4970 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Nov 28 13:19:49 crc kubenswrapper[4970]: W1128 13:19:49.255368 4970 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Nov 28 13:19:49 crc kubenswrapper[4970]: W1128 13:19:49.255373 4970 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Nov 28 13:19:49 crc kubenswrapper[4970]: W1128 13:19:49.255377 4970 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Nov 28 13:19:49 crc kubenswrapper[4970]: W1128 13:19:49.255382 4970 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
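The long runs of "unrecognized feature gate" warnings are the upstream kubelet's feature-gate parser skipping OpenShift-level gate names it does not know about; the later feature_gate.go:386 summary line lists what was actually accepted. A small stdlib-only sketch for tallying the rejected gates from a saved journal excerpt (the kubelet.log path and helper name are illustrative, not part of the log):

```python
# Sketch: count distinct gates the kubelet rejected, from a saved journal
# excerpt, e.g. produced with `journalctl -u kubelet > kubelet.log`.
import re
from collections import Counter

PATTERN = re.compile(r"unrecognized feature gate: (\S+)")

def unrecognized_gates(path: str = "kubelet.log") -> Counter:
    with open(path, encoding="utf-8") as fh:
        return Counter(PATTERN.findall(fh.read()))

if __name__ == "__main__":
    gates = unrecognized_gates()
    print(f"{len(gates)} distinct unrecognized gates")
    for name, seen in gates.most_common(5):
        # Each gate is warned about once per feature-gate parse pass.
        print(f"  {name}: warned {seen} times")
```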
Nov 28 13:19:49 crc kubenswrapper[4970]: W1128 13:19:49.255388 4970 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Nov 28 13:19:49 crc kubenswrapper[4970]: W1128 13:19:49.255393 4970 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Nov 28 13:19:49 crc kubenswrapper[4970]: W1128 13:19:49.255396 4970 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Nov 28 13:19:49 crc kubenswrapper[4970]: W1128 13:19:49.255401 4970 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Nov 28 13:19:49 crc kubenswrapper[4970]: W1128 13:19:49.255405 4970 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Nov 28 13:19:49 crc kubenswrapper[4970]: W1128 13:19:49.255408 4970 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Nov 28 13:19:49 crc kubenswrapper[4970]: W1128 13:19:49.255412 4970 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Nov 28 13:19:49 crc kubenswrapper[4970]: W1128 13:19:49.255416 4970 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Nov 28 13:19:49 crc kubenswrapper[4970]: W1128 13:19:49.255419 4970 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Nov 28 13:19:49 crc kubenswrapper[4970]: W1128 13:19:49.255423 4970 feature_gate.go:330] unrecognized feature gate: PinnedImages Nov 28 13:19:49 crc kubenswrapper[4970]: W1128 13:19:49.255426 4970 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Nov 28 13:19:49 crc kubenswrapper[4970]: W1128 13:19:49.255430 4970 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Nov 28 13:19:49 crc kubenswrapper[4970]: W1128 13:19:49.255433 4970 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Nov 28 13:19:49 crc kubenswrapper[4970]: W1128 13:19:49.255437 4970 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Nov 28 13:19:49 crc kubenswrapper[4970]: W1128 13:19:49.255440 4970 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Nov 28 13:19:49 crc kubenswrapper[4970]: W1128 13:19:49.255444 4970 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Nov 28 13:19:49 crc kubenswrapper[4970]: W1128 13:19:49.255447 4970 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Nov 28 13:19:49 crc kubenswrapper[4970]: W1128 13:19:49.255451 4970 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Nov 28 13:19:49 crc kubenswrapper[4970]: W1128 13:19:49.255455 4970 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Nov 28 13:19:49 crc kubenswrapper[4970]: W1128 13:19:49.255458 4970 feature_gate.go:330] unrecognized feature gate: OVNObservability Nov 28 13:19:49 crc kubenswrapper[4970]: W1128 13:19:49.255462 4970 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Nov 28 13:19:49 crc kubenswrapper[4970]: W1128 13:19:49.255466 4970 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Nov 28 13:19:49 crc kubenswrapper[4970]: W1128 13:19:49.255469 4970 feature_gate.go:330] unrecognized feature gate: SignatureStores Nov 28 13:19:49 crc kubenswrapper[4970]: W1128 13:19:49.255473 4970 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Nov 28 13:19:49 crc kubenswrapper[4970]: W1128 13:19:49.255476 4970 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Nov 28 13:19:49 crc kubenswrapper[4970]: W1128 13:19:49.255480 4970 
feature_gate.go:330] unrecognized feature gate: HardwareSpeed Nov 28 13:19:49 crc kubenswrapper[4970]: W1128 13:19:49.255483 4970 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Nov 28 13:19:49 crc kubenswrapper[4970]: W1128 13:19:49.255487 4970 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Nov 28 13:19:49 crc kubenswrapper[4970]: W1128 13:19:49.255490 4970 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.255817 4970 flags.go:64] FLAG: --address="0.0.0.0" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.255828 4970 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.255837 4970 flags.go:64] FLAG: --anonymous-auth="true" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.255843 4970 flags.go:64] FLAG: --application-metrics-count-limit="100" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.255848 4970 flags.go:64] FLAG: --authentication-token-webhook="false" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.255852 4970 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.255858 4970 flags.go:64] FLAG: --authorization-mode="AlwaysAllow" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.255863 4970 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.255868 4970 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.255872 4970 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.255877 4970 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.255881 4970 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.255885 4970 flags.go:64] FLAG: --cgroup-driver="cgroupfs" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.255889 4970 flags.go:64] FLAG: --cgroup-root="" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.255893 4970 flags.go:64] FLAG: --cgroups-per-qos="true" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.255897 4970 flags.go:64] FLAG: --client-ca-file="" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.255901 4970 flags.go:64] FLAG: --cloud-config="" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.255914 4970 flags.go:64] FLAG: --cloud-provider="" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.255919 4970 flags.go:64] FLAG: --cluster-dns="[]" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.255923 4970 flags.go:64] FLAG: --cluster-domain="" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.255927 4970 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.255931 4970 flags.go:64] FLAG: --config-dir="" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.255935 4970 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.255940 4970 flags.go:64] FLAG: --container-log-max-files="5" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.255945 4970 flags.go:64] FLAG: --container-log-max-size="10Mi" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.255949 4970 flags.go:64] FLAG: 
--container-runtime-endpoint="/var/run/crio/crio.sock" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.255953 4970 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.255958 4970 flags.go:64] FLAG: --containerd-namespace="k8s.io" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.255962 4970 flags.go:64] FLAG: --contention-profiling="false" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.255966 4970 flags.go:64] FLAG: --cpu-cfs-quota="true" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.255972 4970 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.255976 4970 flags.go:64] FLAG: --cpu-manager-policy="none" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.255980 4970 flags.go:64] FLAG: --cpu-manager-policy-options="" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.255985 4970 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.255990 4970 flags.go:64] FLAG: --enable-controller-attach-detach="true" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.255994 4970 flags.go:64] FLAG: --enable-debugging-handlers="true" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.255998 4970 flags.go:64] FLAG: --enable-load-reader="false" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.256002 4970 flags.go:64] FLAG: --enable-server="true" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.256006 4970 flags.go:64] FLAG: --enforce-node-allocatable="[pods]" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.256011 4970 flags.go:64] FLAG: --event-burst="100" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.256015 4970 flags.go:64] FLAG: --event-qps="50" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.256019 4970 flags.go:64] FLAG: --event-storage-age-limit="default=0" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.256023 4970 flags.go:64] FLAG: --event-storage-event-limit="default=0" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.256027 4970 flags.go:64] FLAG: --eviction-hard="" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.256032 4970 flags.go:64] FLAG: --eviction-max-pod-grace-period="0" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.256037 4970 flags.go:64] FLAG: --eviction-minimum-reclaim="" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.256041 4970 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.256045 4970 flags.go:64] FLAG: --eviction-soft="" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.256049 4970 flags.go:64] FLAG: --eviction-soft-grace-period="" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.256053 4970 flags.go:64] FLAG: --exit-on-lock-contention="false" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.256057 4970 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.256061 4970 flags.go:64] FLAG: --experimental-mounter-path="" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.256065 4970 flags.go:64] FLAG: --fail-cgroupv1="false" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.256069 4970 flags.go:64] FLAG: --fail-swap-on="true" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.256073 4970 flags.go:64] FLAG: --feature-gates="" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.256078 4970 
flags.go:64] FLAG: --file-check-frequency="20s" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.256082 4970 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.256086 4970 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.256090 4970 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.256094 4970 flags.go:64] FLAG: --healthz-port="10248" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.256099 4970 flags.go:64] FLAG: --help="false" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.256103 4970 flags.go:64] FLAG: --hostname-override="" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.256107 4970 flags.go:64] FLAG: --housekeeping-interval="10s" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.256111 4970 flags.go:64] FLAG: --http-check-frequency="20s" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.256115 4970 flags.go:64] FLAG: --image-credential-provider-bin-dir="" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.256119 4970 flags.go:64] FLAG: --image-credential-provider-config="" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.256123 4970 flags.go:64] FLAG: --image-gc-high-threshold="85" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.256128 4970 flags.go:64] FLAG: --image-gc-low-threshold="80" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.256131 4970 flags.go:64] FLAG: --image-service-endpoint="" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.256135 4970 flags.go:64] FLAG: --kernel-memcg-notification="false" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.256139 4970 flags.go:64] FLAG: --kube-api-burst="100" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.256144 4970 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.256148 4970 flags.go:64] FLAG: --kube-api-qps="50" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.256152 4970 flags.go:64] FLAG: --kube-reserved="" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.256156 4970 flags.go:64] FLAG: --kube-reserved-cgroup="" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.256159 4970 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.256164 4970 flags.go:64] FLAG: --kubelet-cgroups="" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.256168 4970 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.256172 4970 flags.go:64] FLAG: --lock-file="" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.256175 4970 flags.go:64] FLAG: --log-cadvisor-usage="false" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.256180 4970 flags.go:64] FLAG: --log-flush-frequency="5s" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.256184 4970 flags.go:64] FLAG: --log-json-info-buffer-size="0" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.256190 4970 flags.go:64] FLAG: --log-json-split-stream="false" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.256194 4970 flags.go:64] FLAG: --log-text-info-buffer-size="0" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.256198 4970 flags.go:64] FLAG: --log-text-split-stream="false" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.256202 4970 flags.go:64] FLAG: 
--logging-format="text" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.256206 4970 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.256224 4970 flags.go:64] FLAG: --make-iptables-util-chains="true" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.256229 4970 flags.go:64] FLAG: --manifest-url="" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.256232 4970 flags.go:64] FLAG: --manifest-url-header="" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.256238 4970 flags.go:64] FLAG: --max-housekeeping-interval="15s" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.256242 4970 flags.go:64] FLAG: --max-open-files="1000000" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.256247 4970 flags.go:64] FLAG: --max-pods="110" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.256251 4970 flags.go:64] FLAG: --maximum-dead-containers="-1" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.256256 4970 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.256259 4970 flags.go:64] FLAG: --memory-manager-policy="None" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.256263 4970 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.256268 4970 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.256274 4970 flags.go:64] FLAG: --node-ip="192.168.126.11" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.256278 4970 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/control-plane=,node-role.kubernetes.io/master=,node.openshift.io/os_id=rhcos" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.256288 4970 flags.go:64] FLAG: --node-status-max-images="50" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.256293 4970 flags.go:64] FLAG: --node-status-update-frequency="10s" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.256297 4970 flags.go:64] FLAG: --oom-score-adj="-999" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.256302 4970 flags.go:64] FLAG: --pod-cidr="" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.256306 4970 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:33549946e22a9ffa738fd94b1345f90921bc8f92fa6137784cb33c77ad806f9d" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.256313 4970 flags.go:64] FLAG: --pod-manifest-path="" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.256317 4970 flags.go:64] FLAG: --pod-max-pids="-1" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.256321 4970 flags.go:64] FLAG: --pods-per-core="0" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.256325 4970 flags.go:64] FLAG: --port="10250" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.256330 4970 flags.go:64] FLAG: --protect-kernel-defaults="false" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.256333 4970 flags.go:64] FLAG: --provider-id="" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.256337 4970 flags.go:64] FLAG: --qos-reserved="" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.256342 4970 flags.go:64] FLAG: --read-only-port="10255" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.256346 4970 flags.go:64] FLAG: --register-node="true" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.256350 4970 flags.go:64] FLAG: 
--register-schedulable="true" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.256354 4970 flags.go:64] FLAG: --register-with-taints="node-role.kubernetes.io/master=:NoSchedule" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.256361 4970 flags.go:64] FLAG: --registry-burst="10" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.256365 4970 flags.go:64] FLAG: --registry-qps="5" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.256369 4970 flags.go:64] FLAG: --reserved-cpus="" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.256373 4970 flags.go:64] FLAG: --reserved-memory="" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.256378 4970 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.256382 4970 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.256387 4970 flags.go:64] FLAG: --rotate-certificates="false" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.256391 4970 flags.go:64] FLAG: --rotate-server-certificates="false" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.256395 4970 flags.go:64] FLAG: --runonce="false" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.256399 4970 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.256403 4970 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.256407 4970 flags.go:64] FLAG: --seccomp-default="false" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.256411 4970 flags.go:64] FLAG: --serialize-image-pulls="true" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.256415 4970 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.256419 4970 flags.go:64] FLAG: --storage-driver-db="cadvisor" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.256424 4970 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.256428 4970 flags.go:64] FLAG: --storage-driver-password="root" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.256432 4970 flags.go:64] FLAG: --storage-driver-secure="false" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.256436 4970 flags.go:64] FLAG: --storage-driver-table="stats" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.256440 4970 flags.go:64] FLAG: --storage-driver-user="root" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.256445 4970 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.256449 4970 flags.go:64] FLAG: --sync-frequency="1m0s" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.256454 4970 flags.go:64] FLAG: --system-cgroups="" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.256459 4970 flags.go:64] FLAG: --system-reserved="cpu=200m,ephemeral-storage=350Mi,memory=350Mi" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.256466 4970 flags.go:64] FLAG: --system-reserved-cgroup="" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.256470 4970 flags.go:64] FLAG: --tls-cert-file="" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.256474 4970 flags.go:64] FLAG: --tls-cipher-suites="[]" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.256479 4970 flags.go:64] FLAG: --tls-min-version="" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.256485 4970 flags.go:64] 
FLAG: --tls-private-key-file="" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.256489 4970 flags.go:64] FLAG: --topology-manager-policy="none" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.256493 4970 flags.go:64] FLAG: --topology-manager-policy-options="" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.256497 4970 flags.go:64] FLAG: --topology-manager-scope="container" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.256501 4970 flags.go:64] FLAG: --v="2" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.256507 4970 flags.go:64] FLAG: --version="false" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.256512 4970 flags.go:64] FLAG: --vmodule="" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.256517 4970 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.256522 4970 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Nov 28 13:19:49 crc kubenswrapper[4970]: W1128 13:19:49.256738 4970 feature_gate.go:330] unrecognized feature gate: PlatformOperators Nov 28 13:19:49 crc kubenswrapper[4970]: W1128 13:19:49.256745 4970 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Nov 28 13:19:49 crc kubenswrapper[4970]: W1128 13:19:49.256750 4970 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Nov 28 13:19:49 crc kubenswrapper[4970]: W1128 13:19:49.256754 4970 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Nov 28 13:19:49 crc kubenswrapper[4970]: W1128 13:19:49.256759 4970 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Nov 28 13:19:49 crc kubenswrapper[4970]: W1128 13:19:49.256762 4970 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Nov 28 13:19:49 crc kubenswrapper[4970]: W1128 13:19:49.256766 4970 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Nov 28 13:19:49 crc kubenswrapper[4970]: W1128 13:19:49.256770 4970 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Nov 28 13:19:49 crc kubenswrapper[4970]: W1128 13:19:49.256774 4970 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Nov 28 13:19:49 crc kubenswrapper[4970]: W1128 13:19:49.256779 4970 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
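The flags.go:64 dump above records every kubelet flag together with its effective value (built-in defaults plus whatever the service unit passed). When comparing two nodes it can be handy to fold that dump into a dictionary; a sketch, again assuming the journal excerpt was saved to a file:

```python
# Sketch: extract the `flags.go:64] FLAG: --name="value"` entries into a dict,
# e.g. to diff startup flags between nodes. The input path is illustrative.
import re

FLAG_RE = re.compile(r'flags\.go:\d+\] FLAG: --([\w-]+)="([^"]*)"')

def parse_flag_dump(path: str) -> dict[str, str]:
    with open(path, encoding="utf-8") as fh:
        return {name: value for name, value in FLAG_RE.findall(fh.read())}

flags = parse_flag_dump("kubelet.log")
# Values the later startup steps rely on, as seen in the dump above:
print(flags.get("config"))                      # /etc/kubernetes/kubelet.conf
print(flags.get("container-runtime-endpoint"))  # /var/run/crio/crio.sock
print(flags.get("node-ip"))                     # 192.168.126.11
```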
Nov 28 13:19:49 crc kubenswrapper[4970]: W1128 13:19:49.256783 4970 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Nov 28 13:19:49 crc kubenswrapper[4970]: W1128 13:19:49.256787 4970 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Nov 28 13:19:49 crc kubenswrapper[4970]: W1128 13:19:49.256791 4970 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Nov 28 13:19:49 crc kubenswrapper[4970]: W1128 13:19:49.256795 4970 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Nov 28 13:19:49 crc kubenswrapper[4970]: W1128 13:19:49.256798 4970 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Nov 28 13:19:49 crc kubenswrapper[4970]: W1128 13:19:49.256802 4970 feature_gate.go:330] unrecognized feature gate: InsightsConfig Nov 28 13:19:49 crc kubenswrapper[4970]: W1128 13:19:49.256805 4970 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Nov 28 13:19:49 crc kubenswrapper[4970]: W1128 13:19:49.256809 4970 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Nov 28 13:19:49 crc kubenswrapper[4970]: W1128 13:19:49.256812 4970 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Nov 28 13:19:49 crc kubenswrapper[4970]: W1128 13:19:49.256816 4970 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Nov 28 13:19:49 crc kubenswrapper[4970]: W1128 13:19:49.256821 4970 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Nov 28 13:19:49 crc kubenswrapper[4970]: W1128 13:19:49.256826 4970 feature_gate.go:330] unrecognized feature gate: Example Nov 28 13:19:49 crc kubenswrapper[4970]: W1128 13:19:49.256830 4970 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Nov 28 13:19:49 crc kubenswrapper[4970]: W1128 13:19:49.256840 4970 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Nov 28 13:19:49 crc kubenswrapper[4970]: W1128 13:19:49.256843 4970 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Nov 28 13:19:49 crc kubenswrapper[4970]: W1128 13:19:49.256847 4970 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Nov 28 13:19:49 crc kubenswrapper[4970]: W1128 13:19:49.256851 4970 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Nov 28 13:19:49 crc kubenswrapper[4970]: W1128 13:19:49.256854 4970 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Nov 28 13:19:49 crc kubenswrapper[4970]: W1128 13:19:49.256858 4970 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Nov 28 13:19:49 crc kubenswrapper[4970]: W1128 13:19:49.256861 4970 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Nov 28 13:19:49 crc kubenswrapper[4970]: W1128 13:19:49.256865 4970 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Nov 28 13:19:49 crc kubenswrapper[4970]: W1128 13:19:49.256869 4970 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Nov 28 13:19:49 crc kubenswrapper[4970]: W1128 13:19:49.256873 4970 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Nov 28 13:19:49 crc kubenswrapper[4970]: W1128 13:19:49.256876 4970 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Nov 28 13:19:49 crc kubenswrapper[4970]: W1128 13:19:49.256879 4970 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Nov 28 13:19:49 crc kubenswrapper[4970]: W1128 
13:19:49.256883 4970 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Nov 28 13:19:49 crc kubenswrapper[4970]: W1128 13:19:49.256886 4970 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Nov 28 13:19:49 crc kubenswrapper[4970]: W1128 13:19:49.256890 4970 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Nov 28 13:19:49 crc kubenswrapper[4970]: W1128 13:19:49.256894 4970 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Nov 28 13:19:49 crc kubenswrapper[4970]: W1128 13:19:49.256898 4970 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Nov 28 13:19:49 crc kubenswrapper[4970]: W1128 13:19:49.256901 4970 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Nov 28 13:19:49 crc kubenswrapper[4970]: W1128 13:19:49.256905 4970 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Nov 28 13:19:49 crc kubenswrapper[4970]: W1128 13:19:49.256909 4970 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Nov 28 13:19:49 crc kubenswrapper[4970]: W1128 13:19:49.256912 4970 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Nov 28 13:19:49 crc kubenswrapper[4970]: W1128 13:19:49.256916 4970 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Nov 28 13:19:49 crc kubenswrapper[4970]: W1128 13:19:49.256919 4970 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Nov 28 13:19:49 crc kubenswrapper[4970]: W1128 13:19:49.256924 4970 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Nov 28 13:19:49 crc kubenswrapper[4970]: W1128 13:19:49.256928 4970 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Nov 28 13:19:49 crc kubenswrapper[4970]: W1128 13:19:49.256932 4970 feature_gate.go:330] unrecognized feature gate: GatewayAPI Nov 28 13:19:49 crc kubenswrapper[4970]: W1128 13:19:49.256936 4970 feature_gate.go:330] unrecognized feature gate: SignatureStores Nov 28 13:19:49 crc kubenswrapper[4970]: W1128 13:19:49.256940 4970 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Nov 28 13:19:49 crc kubenswrapper[4970]: W1128 13:19:49.256944 4970 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Nov 28 13:19:49 crc kubenswrapper[4970]: W1128 13:19:49.256947 4970 feature_gate.go:330] unrecognized feature gate: PinnedImages Nov 28 13:19:49 crc kubenswrapper[4970]: W1128 13:19:49.256951 4970 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Nov 28 13:19:49 crc kubenswrapper[4970]: W1128 13:19:49.256954 4970 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Nov 28 13:19:49 crc kubenswrapper[4970]: W1128 13:19:49.256959 4970 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Nov 28 13:19:49 crc kubenswrapper[4970]: W1128 13:19:49.256963 4970 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Nov 28 13:19:49 crc kubenswrapper[4970]: W1128 13:19:49.256968 4970 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Nov 28 13:19:49 crc kubenswrapper[4970]: W1128 13:19:49.256972 4970 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Nov 28 13:19:49 crc kubenswrapper[4970]: W1128 13:19:49.256975 4970 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Nov 28 13:19:49 crc kubenswrapper[4970]: W1128 13:19:49.256979 4970 
feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Nov 28 13:19:49 crc kubenswrapper[4970]: W1128 13:19:49.256982 4970 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Nov 28 13:19:49 crc kubenswrapper[4970]: W1128 13:19:49.256985 4970 feature_gate.go:330] unrecognized feature gate: OVNObservability Nov 28 13:19:49 crc kubenswrapper[4970]: W1128 13:19:49.256989 4970 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Nov 28 13:19:49 crc kubenswrapper[4970]: W1128 13:19:49.256993 4970 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Nov 28 13:19:49 crc kubenswrapper[4970]: W1128 13:19:49.256996 4970 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Nov 28 13:19:49 crc kubenswrapper[4970]: W1128 13:19:49.257000 4970 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Nov 28 13:19:49 crc kubenswrapper[4970]: W1128 13:19:49.257004 4970 feature_gate.go:330] unrecognized feature gate: NewOLM Nov 28 13:19:49 crc kubenswrapper[4970]: W1128 13:19:49.257008 4970 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Nov 28 13:19:49 crc kubenswrapper[4970]: W1128 13:19:49.257012 4970 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Nov 28 13:19:49 crc kubenswrapper[4970]: W1128 13:19:49.257015 4970 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.257021 4970 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.264655 4970 server.go:491] "Kubelet version" kubeletVersion="v1.31.5" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.264689 4970 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Nov 28 13:19:49 crc kubenswrapper[4970]: W1128 13:19:49.265843 4970 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
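The feature_gate.go:386 entry just above is the compact summary of the gates that actually took effect, printed as a Go map. A sketch that converts such a line into a Python dict for scripted checks; the string in the example is an abbreviated excerpt of the map shown in the log, not the full set:

```python
# Sketch: parse the Go-style `feature gates: {map[Name:bool ...]}` summary into
# a Python dict, so the effective gates can be asserted on in automation.
import re

def parse_feature_gates(line: str) -> dict[str, bool]:
    body = re.search(r"map\[([^\]]*)\]", line).group(1)
    return {k: v == "true" for k, v in (item.split(":") for item in body.split())}

summary = ("feature gates: {map[CloudDualStackNodeIPs:true "
           "DisableKubeletCloudCredentialProviders:true KMSv1:true "
           "ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}")
gates = parse_feature_gates(summary)
assert gates["KMSv1"] and not gates["VolumeAttributesClass"]
print(gates)
```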
Nov 28 13:19:49 crc kubenswrapper[4970]: W1128 13:19:49.265895 4970 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Nov 28 13:19:49 crc kubenswrapper[4970]: W1128 13:19:49.265902 4970 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Nov 28 13:19:49 crc kubenswrapper[4970]: W1128 13:19:49.265907 4970 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Nov 28 13:19:49 crc kubenswrapper[4970]: W1128 13:19:49.265912 4970 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Nov 28 13:19:49 crc kubenswrapper[4970]: W1128 13:19:49.265916 4970 feature_gate.go:330] unrecognized feature gate: InsightsConfig Nov 28 13:19:49 crc kubenswrapper[4970]: W1128 13:19:49.265922 4970 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Nov 28 13:19:49 crc kubenswrapper[4970]: W1128 13:19:49.265933 4970 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Nov 28 13:19:49 crc kubenswrapper[4970]: W1128 13:19:49.265937 4970 feature_gate.go:330] unrecognized feature gate: OVNObservability Nov 28 13:19:49 crc kubenswrapper[4970]: W1128 13:19:49.265942 4970 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Nov 28 13:19:49 crc kubenswrapper[4970]: W1128 13:19:49.265946 4970 feature_gate.go:330] unrecognized feature gate: PlatformOperators Nov 28 13:19:49 crc kubenswrapper[4970]: W1128 13:19:49.265951 4970 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Nov 28 13:19:49 crc kubenswrapper[4970]: W1128 13:19:49.265955 4970 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Nov 28 13:19:49 crc kubenswrapper[4970]: W1128 13:19:49.265960 4970 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Nov 28 13:19:49 crc kubenswrapper[4970]: W1128 13:19:49.265964 4970 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Nov 28 13:19:49 crc kubenswrapper[4970]: W1128 13:19:49.265968 4970 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Nov 28 13:19:49 crc kubenswrapper[4970]: W1128 13:19:49.265973 4970 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Nov 28 13:19:49 crc kubenswrapper[4970]: W1128 13:19:49.265977 4970 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Nov 28 13:19:49 crc kubenswrapper[4970]: W1128 13:19:49.265982 4970 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Nov 28 13:19:49 crc kubenswrapper[4970]: W1128 13:19:49.265986 4970 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Nov 28 13:19:49 crc kubenswrapper[4970]: W1128 13:19:49.265994 4970 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Nov 28 13:19:49 crc kubenswrapper[4970]: W1128 13:19:49.265999 4970 feature_gate.go:330] unrecognized feature gate: SignatureStores Nov 28 13:19:49 crc kubenswrapper[4970]: W1128 13:19:49.266004 4970 feature_gate.go:330] unrecognized feature gate: Example Nov 28 13:19:49 crc kubenswrapper[4970]: W1128 13:19:49.266008 4970 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Nov 28 13:19:49 crc kubenswrapper[4970]: W1128 13:19:49.266012 4970 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Nov 28 13:19:49 crc kubenswrapper[4970]: W1128 13:19:49.266016 4970 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Nov 28 13:19:49 crc kubenswrapper[4970]: W1128 13:19:49.266021 4970 feature_gate.go:330] unrecognized feature gate: 
AdditionalRoutingCapabilities Nov 28 13:19:49 crc kubenswrapper[4970]: W1128 13:19:49.266025 4970 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Nov 28 13:19:49 crc kubenswrapper[4970]: W1128 13:19:49.266031 4970 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Nov 28 13:19:49 crc kubenswrapper[4970]: W1128 13:19:49.266035 4970 feature_gate.go:330] unrecognized feature gate: PinnedImages Nov 28 13:19:49 crc kubenswrapper[4970]: W1128 13:19:49.266041 4970 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Nov 28 13:19:49 crc kubenswrapper[4970]: W1128 13:19:49.266045 4970 feature_gate.go:330] unrecognized feature gate: GatewayAPI Nov 28 13:19:49 crc kubenswrapper[4970]: W1128 13:19:49.266054 4970 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Nov 28 13:19:49 crc kubenswrapper[4970]: W1128 13:19:49.266059 4970 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Nov 28 13:19:49 crc kubenswrapper[4970]: W1128 13:19:49.266064 4970 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Nov 28 13:19:49 crc kubenswrapper[4970]: W1128 13:19:49.266069 4970 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Nov 28 13:19:49 crc kubenswrapper[4970]: W1128 13:19:49.266074 4970 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Nov 28 13:19:49 crc kubenswrapper[4970]: W1128 13:19:49.266080 4970 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Nov 28 13:19:49 crc kubenswrapper[4970]: W1128 13:19:49.266085 4970 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Nov 28 13:19:49 crc kubenswrapper[4970]: W1128 13:19:49.266089 4970 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Nov 28 13:19:49 crc kubenswrapper[4970]: W1128 13:19:49.266095 4970 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Nov 28 13:19:49 crc kubenswrapper[4970]: W1128 13:19:49.266100 4970 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Nov 28 13:19:49 crc kubenswrapper[4970]: W1128 13:19:49.266129 4970 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Nov 28 13:19:49 crc kubenswrapper[4970]: W1128 13:19:49.266135 4970 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Nov 28 13:19:49 crc kubenswrapper[4970]: W1128 13:19:49.266141 4970 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Nov 28 13:19:49 crc kubenswrapper[4970]: W1128 13:19:49.266329 4970 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Nov 28 13:19:49 crc kubenswrapper[4970]: W1128 13:19:49.266336 4970 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Nov 28 13:19:49 crc kubenswrapper[4970]: W1128 13:19:49.266341 4970 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Nov 28 13:19:49 crc kubenswrapper[4970]: W1128 13:19:49.266346 4970 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Nov 28 13:19:49 crc kubenswrapper[4970]: W1128 13:19:49.266351 4970 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Nov 28 13:19:49 crc kubenswrapper[4970]: W1128 13:19:49.266355 4970 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Nov 28 13:19:49 crc kubenswrapper[4970]: W1128 13:19:49.266360 4970 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Nov 28 13:19:49 crc kubenswrapper[4970]: W1128 13:19:49.266364 4970 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Nov 
28 13:19:49 crc kubenswrapper[4970]: W1128 13:19:49.266368 4970 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Nov 28 13:19:49 crc kubenswrapper[4970]: W1128 13:19:49.266372 4970 feature_gate.go:330] unrecognized feature gate: NewOLM Nov 28 13:19:49 crc kubenswrapper[4970]: W1128 13:19:49.266393 4970 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Nov 28 13:19:49 crc kubenswrapper[4970]: W1128 13:19:49.266398 4970 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Nov 28 13:19:49 crc kubenswrapper[4970]: W1128 13:19:49.266404 4970 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Nov 28 13:19:49 crc kubenswrapper[4970]: W1128 13:19:49.266409 4970 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Nov 28 13:19:49 crc kubenswrapper[4970]: W1128 13:19:49.266414 4970 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Nov 28 13:19:49 crc kubenswrapper[4970]: W1128 13:19:49.266419 4970 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Nov 28 13:19:49 crc kubenswrapper[4970]: W1128 13:19:49.266425 4970 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Nov 28 13:19:49 crc kubenswrapper[4970]: W1128 13:19:49.266430 4970 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Nov 28 13:19:49 crc kubenswrapper[4970]: W1128 13:19:49.266435 4970 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Nov 28 13:19:49 crc kubenswrapper[4970]: W1128 13:19:49.266440 4970 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Nov 28 13:19:49 crc kubenswrapper[4970]: W1128 13:19:49.266444 4970 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Nov 28 13:19:49 crc kubenswrapper[4970]: W1128 13:19:49.266449 4970 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Nov 28 13:19:49 crc kubenswrapper[4970]: W1128 13:19:49.266453 4970 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Nov 28 13:19:49 crc kubenswrapper[4970]: W1128 13:19:49.266473 4970 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Nov 28 13:19:49 crc kubenswrapper[4970]: W1128 13:19:49.266478 4970 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Nov 28 13:19:49 crc kubenswrapper[4970]: W1128 13:19:49.266482 4970 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.266489 4970 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Nov 28 13:19:49 crc kubenswrapper[4970]: W1128 13:19:49.266650 4970 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Nov 28 13:19:49 crc kubenswrapper[4970]: W1128 13:19:49.266656 4970 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Nov 28 13:19:49 crc kubenswrapper[4970]: W1128 
13:19:49.266660 4970 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Nov 28 13:19:49 crc kubenswrapper[4970]: W1128 13:19:49.266667 4970 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Nov 28 13:19:49 crc kubenswrapper[4970]: W1128 13:19:49.266672 4970 feature_gate.go:330] unrecognized feature gate: SignatureStores Nov 28 13:19:49 crc kubenswrapper[4970]: W1128 13:19:49.266676 4970 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Nov 28 13:19:49 crc kubenswrapper[4970]: W1128 13:19:49.266682 4970 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Nov 28 13:19:49 crc kubenswrapper[4970]: W1128 13:19:49.266687 4970 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Nov 28 13:19:49 crc kubenswrapper[4970]: W1128 13:19:49.266692 4970 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Nov 28 13:19:49 crc kubenswrapper[4970]: W1128 13:19:49.266696 4970 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Nov 28 13:19:49 crc kubenswrapper[4970]: W1128 13:19:49.266700 4970 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Nov 28 13:19:49 crc kubenswrapper[4970]: W1128 13:19:49.266704 4970 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Nov 28 13:19:49 crc kubenswrapper[4970]: W1128 13:19:49.266708 4970 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Nov 28 13:19:49 crc kubenswrapper[4970]: W1128 13:19:49.266712 4970 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Nov 28 13:19:49 crc kubenswrapper[4970]: W1128 13:19:49.266716 4970 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Nov 28 13:19:49 crc kubenswrapper[4970]: W1128 13:19:49.266720 4970 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Nov 28 13:19:49 crc kubenswrapper[4970]: W1128 13:19:49.266724 4970 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Nov 28 13:19:49 crc kubenswrapper[4970]: W1128 13:19:49.266728 4970 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Nov 28 13:19:49 crc kubenswrapper[4970]: W1128 13:19:49.266732 4970 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Nov 28 13:19:49 crc kubenswrapper[4970]: W1128 13:19:49.266737 4970 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Nov 28 13:19:49 crc kubenswrapper[4970]: W1128 13:19:49.266741 4970 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Nov 28 13:19:49 crc kubenswrapper[4970]: W1128 13:19:49.266745 4970 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Nov 28 13:19:49 crc kubenswrapper[4970]: W1128 13:19:49.266749 4970 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Nov 28 13:19:49 crc kubenswrapper[4970]: W1128 13:19:49.266754 4970 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Nov 28 13:19:49 crc kubenswrapper[4970]: W1128 13:19:49.266758 4970 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Nov 28 13:19:49 crc kubenswrapper[4970]: W1128 13:19:49.266762 4970 feature_gate.go:330] unrecognized feature gate: OVNObservability Nov 28 13:19:49 crc kubenswrapper[4970]: W1128 13:19:49.266767 4970 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Nov 28 13:19:49 crc kubenswrapper[4970]: W1128 13:19:49.266771 4970 feature_gate.go:330] unrecognized feature gate: PinnedImages Nov 
28 13:19:49 crc kubenswrapper[4970]: W1128 13:19:49.266790 4970 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Nov 28 13:19:49 crc kubenswrapper[4970]: W1128 13:19:49.266794 4970 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Nov 28 13:19:49 crc kubenswrapper[4970]: W1128 13:19:49.266798 4970 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Nov 28 13:19:49 crc kubenswrapper[4970]: W1128 13:19:49.266802 4970 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Nov 28 13:19:49 crc kubenswrapper[4970]: W1128 13:19:49.266806 4970 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Nov 28 13:19:49 crc kubenswrapper[4970]: W1128 13:19:49.266810 4970 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Nov 28 13:19:49 crc kubenswrapper[4970]: W1128 13:19:49.266814 4970 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Nov 28 13:19:49 crc kubenswrapper[4970]: W1128 13:19:49.266818 4970 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Nov 28 13:19:49 crc kubenswrapper[4970]: W1128 13:19:49.266823 4970 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Nov 28 13:19:49 crc kubenswrapper[4970]: W1128 13:19:49.266826 4970 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Nov 28 13:19:49 crc kubenswrapper[4970]: W1128 13:19:49.266831 4970 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Nov 28 13:19:49 crc kubenswrapper[4970]: W1128 13:19:49.266836 4970 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Nov 28 13:19:49 crc kubenswrapper[4970]: W1128 13:19:49.266840 4970 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Nov 28 13:19:49 crc kubenswrapper[4970]: W1128 13:19:49.266845 4970 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Nov 28 13:19:49 crc kubenswrapper[4970]: W1128 13:19:49.266850 4970 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Nov 28 13:19:49 crc kubenswrapper[4970]: W1128 13:19:49.266854 4970 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Nov 28 13:19:49 crc kubenswrapper[4970]: W1128 13:19:49.266859 4970 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Nov 28 13:19:49 crc kubenswrapper[4970]: W1128 13:19:49.266864 4970 feature_gate.go:330] unrecognized feature gate: InsightsConfig Nov 28 13:19:49 crc kubenswrapper[4970]: W1128 13:19:49.266868 4970 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Nov 28 13:19:49 crc kubenswrapper[4970]: W1128 13:19:49.266872 4970 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Nov 28 13:19:49 crc kubenswrapper[4970]: W1128 13:19:49.266876 4970 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Nov 28 13:19:49 crc kubenswrapper[4970]: W1128 13:19:49.266880 4970 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Nov 28 13:19:49 crc kubenswrapper[4970]: W1128 13:19:49.266884 4970 feature_gate.go:330] unrecognized feature gate: NewOLM Nov 28 13:19:49 crc kubenswrapper[4970]: W1128 13:19:49.266888 4970 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Nov 28 13:19:49 crc kubenswrapper[4970]: W1128 13:19:49.266892 4970 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Nov 28 13:19:49 crc kubenswrapper[4970]: W1128 13:19:49.266896 4970 feature_gate.go:330] unrecognized feature gate: PlatformOperators Nov 28 13:19:49 crc kubenswrapper[4970]: W1128 13:19:49.266900 4970 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Nov 28 13:19:49 crc kubenswrapper[4970]: W1128 13:19:49.266904 4970 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Nov 28 13:19:49 crc kubenswrapper[4970]: W1128 13:19:49.266908 4970 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Nov 28 13:19:49 crc kubenswrapper[4970]: W1128 13:19:49.266912 4970 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Nov 28 13:19:49 crc kubenswrapper[4970]: W1128 13:19:49.266916 4970 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Nov 28 13:19:49 crc kubenswrapper[4970]: W1128 13:19:49.266919 4970 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Nov 28 13:19:49 crc kubenswrapper[4970]: W1128 13:19:49.266923 4970 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Nov 28 13:19:49 crc kubenswrapper[4970]: W1128 13:19:49.266927 4970 feature_gate.go:330] unrecognized feature gate: Example Nov 28 13:19:49 crc kubenswrapper[4970]: W1128 13:19:49.266931 4970 feature_gate.go:330] unrecognized feature gate: GatewayAPI Nov 28 13:19:49 crc kubenswrapper[4970]: W1128 13:19:49.266934 4970 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Nov 28 13:19:49 crc kubenswrapper[4970]: W1128 13:19:49.266938 4970 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Nov 28 13:19:49 crc kubenswrapper[4970]: W1128 13:19:49.266942 4970 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Nov 28 13:19:49 crc kubenswrapper[4970]: W1128 13:19:49.266946 4970 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Nov 28 13:19:49 crc kubenswrapper[4970]: W1128 13:19:49.266950 4970 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Nov 28 13:19:49 crc kubenswrapper[4970]: W1128 13:19:49.266955 4970 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Nov 28 13:19:49 crc kubenswrapper[4970]: W1128 13:19:49.266958 4970 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Nov 28 13:19:49 crc kubenswrapper[4970]: W1128 13:19:49.266963 4970 
feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.266969 4970 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.267147 4970 server.go:940] "Client rotation is on, will bootstrap in background" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.269885 4970 bootstrap.go:85] "Current kubeconfig file contents are still valid, no bootstrap necessary" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.269958 4970 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem". Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.270501 4970 server.go:997] "Starting client certificate rotation" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.270520 4970 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate rotation is enabled Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.270787 4970 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2026-02-24 05:52:08 +0000 UTC, rotation deadline is 2025-12-30 06:54:14.089051599 +0000 UTC Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.270829 4970 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Waiting 761h34m24.818225056s for next certificate rotation Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.275447 4970 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.276776 4970 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.283757 4970 log.go:25] "Validated CRI v1 runtime API" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.295258 4970 log.go:25] "Validated CRI v1 image API" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.296650 4970 server.go:1437] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.299411 4970 fs.go:133] Filesystem UUIDs: map[0b076daa-c26a-46d2-b3a6-72a8dbc6e257:/dev/vda4 2025-11-28-13-15-25-00:/dev/sr0 7B77-95E7:/dev/vda2 de0497b0-db1b-465a-b278-03db02455c71:/dev/vda3] Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.299456 4970 fs.go:134] Filesystem partitions: map[/dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /dev/vda3:{mountpoint:/boot major:252 minor:3 fsType:ext4 blockSize:0} /dev/vda4:{mountpoint:/var major:252 minor:4 fsType:xfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /run/user/1000:{mountpoint:/run/user/1000 major:0 minor:45 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:30 fsType:tmpfs blockSize:0} /var/lib/etcd:{mountpoint:/var/lib/etcd major:0 minor:42 fsType:tmpfs blockSize:0}] Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.313961 4970 manager.go:217] 
Machine: {Timestamp:2025-11-28 13:19:49.312499412 +0000 UTC m=+0.165381232 CPUVendorID:AuthenticAMD NumCores:12 NumPhysicalCores:1 NumSockets:12 CpuFrequency:2800000 MemoryCapacity:33654124544 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:21801e6708c44f15b81395eb736a7cec SystemUUID:e0c25d2f-0dea-4920-bc96-a4a60d7cbbdc BootID:ec97c350-a841-4080-9140-13ff7fd8973f Filesystems:[{Device:/dev/vda3 DeviceMajor:252 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/run/user/1000 DeviceMajor:0 DeviceMinor:45 Capacity:3365408768 Type:vfs Inodes:821633 HasInodes:true} {Device:/var/lib/etcd DeviceMajor:0 DeviceMinor:42 Capacity:1073741824 Type:vfs Inodes:4108169 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16827060224 Type:vfs Inodes:4108169 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6730825728 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/vda4 DeviceMajor:252 DeviceMinor:4 Capacity:85292941312 Type:vfs Inodes:41679680 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:30 Capacity:16827064320 Type:vfs Inodes:1048576 HasInodes:true}] DiskMap:map[252:0:{Name:vda Major:252 Minor:0 Size:214748364800 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:fa:16:3e:b6:39:b5 Speed:0 Mtu:1500} {Name:br-int MacAddress:d6:39:55:2e:22:71 Speed:0 Mtu:1400} {Name:ens3 MacAddress:fa:16:3e:b6:39:b5 Speed:-1 Mtu:1500} {Name:ens7 MacAddress:fa:16:3e:a4:8f:ef Speed:-1 Mtu:1500} {Name:ens7.20 MacAddress:52:54:00:94:d4:18 Speed:-1 Mtu:1496} {Name:ens7.21 MacAddress:52:54:00:7d:cc:1d Speed:-1 Mtu:1496} {Name:ens7.22 MacAddress:52:54:00:44:d2:69 Speed:-1 Mtu:1496} {Name:eth10 MacAddress:6a:0d:08:e3:b2:ee Speed:0 Mtu:1500} {Name:ovn-k8s-mp0 MacAddress:0a:58:0a:d9:00:02 Speed:0 Mtu:1400} {Name:ovs-system MacAddress:56:0a:19:9f:91:61 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33654124544 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:0 Size:16777216 Type:Unified Level:3}] SocketID:0 BookID: DrawerID:} {Id:0 Threads:[1] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:1 Size:16777216 Type:Unified Level:3}] SocketID:1 BookID: DrawerID:} {Id:0 Threads:[10] Caches:[{Id:10 Size:32768 Type:Data Level:1} {Id:10 Size:32768 Type:Instruction Level:1} {Id:10 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:10 Size:16777216 Type:Unified Level:3}] SocketID:10 BookID: DrawerID:} {Id:0 Threads:[11] Caches:[{Id:11 Size:32768 Type:Data Level:1} {Id:11 Size:32768 Type:Instruction Level:1} {Id:11 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:11 Size:16777216 Type:Unified Level:3}] SocketID:11 BookID: DrawerID:} {Id:0 Threads:[2] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:2 Size:16777216 Type:Unified Level:3}] SocketID:2 BookID: DrawerID:} {Id:0 Threads:[3] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:3 Size:16777216 Type:Unified Level:3}] SocketID:3 BookID: DrawerID:} {Id:0 Threads:[4] Caches:[{Id:4 Size:32768 Type:Data 
Level:1} {Id:4 Size:32768 Type:Instruction Level:1} {Id:4 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:4 Size:16777216 Type:Unified Level:3}] SocketID:4 BookID: DrawerID:} {Id:0 Threads:[5] Caches:[{Id:5 Size:32768 Type:Data Level:1} {Id:5 Size:32768 Type:Instruction Level:1} {Id:5 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:5 Size:16777216 Type:Unified Level:3}] SocketID:5 BookID: DrawerID:} {Id:0 Threads:[6] Caches:[{Id:6 Size:32768 Type:Data Level:1} {Id:6 Size:32768 Type:Instruction Level:1} {Id:6 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:6 Size:16777216 Type:Unified Level:3}] SocketID:6 BookID: DrawerID:} {Id:0 Threads:[7] Caches:[{Id:7 Size:32768 Type:Data Level:1} {Id:7 Size:32768 Type:Instruction Level:1} {Id:7 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:7 Size:16777216 Type:Unified Level:3}] SocketID:7 BookID: DrawerID:} {Id:0 Threads:[8] Caches:[{Id:8 Size:32768 Type:Data Level:1} {Id:8 Size:32768 Type:Instruction Level:1} {Id:8 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:8 Size:16777216 Type:Unified Level:3}] SocketID:8 BookID: DrawerID:} {Id:0 Threads:[9] Caches:[{Id:9 Size:32768 Type:Data Level:1} {Id:9 Size:32768 Type:Instruction Level:1} {Id:9 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:9 Size:16777216 Type:Unified Level:3}] SocketID:9 BookID: DrawerID:}] Caches:[] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None} Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.314300 4970 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available. Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.314463 4970 manager.go:233] Version: {KernelVersion:5.14.0-427.50.2.el9_4.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 418.94.202502100215-0 DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:} Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.315079 4970 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.315282 4970 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.315318 4970 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" 
nodeConfig={"NodeName":"crc","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"200m","ephemeral-storage":"350Mi","memory":"350Mi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.315537 4970 topology_manager.go:138] "Creating topology manager with none policy" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.315547 4970 container_manager_linux.go:303] "Creating device plugin manager" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.315708 4970 manager.go:142] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.315735 4970 server.go:66] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.315953 4970 state_mem.go:36] "Initialized new in-memory state store" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.316122 4970 server.go:1245] "Using root directory" path="/var/lib/kubelet" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.316648 4970 kubelet.go:418] "Attempting to sync node with API server" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.316668 4970 kubelet.go:313] "Adding static pod path" path="/etc/kubernetes/manifests" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.316692 4970 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.316705 4970 kubelet.go:324] "Adding apiserver pod source" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.316717 4970 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.321041 4970 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="cri-o" version="1.31.5-4.rhaos4.18.gitdad78d5.el9" apiVersion="v1" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.321434 4970 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-server-current.pem". 
Nov 28 13:19:49 crc kubenswrapper[4970]: W1128 13:19:49.321855 4970 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.212:6443: connect: connection refused Nov 28 13:19:49 crc kubenswrapper[4970]: E1128 13:19:49.321954 4970 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.212:6443: connect: connection refused" logger="UnhandledError" Nov 28 13:19:49 crc kubenswrapper[4970]: W1128 13:19:49.321921 4970 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.212:6443: connect: connection refused Nov 28 13:19:49 crc kubenswrapper[4970]: E1128 13:19:49.322106 4970 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.212:6443: connect: connection refused" logger="UnhandledError" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.322151 4970 kubelet.go:854] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.322907 4970 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.322935 4970 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.322944 4970 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.322953 4970 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.322967 4970 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.322976 4970 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/secret" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.322984 4970 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.322998 4970 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.323008 4970 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/fc" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.323035 4970 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.323056 4970 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/projected" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.323063 4970 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.323517 4970 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/csi" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 
13:19:49.323954 4970 server.go:1280] "Started kubelet" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.324260 4970 server.go:163] "Starting to listen" address="0.0.0.0" port=10250 Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.324309 4970 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.212:6443: connect: connection refused Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.324256 4970 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.324855 4970 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Nov 28 13:19:49 crc systemd[1]: Started Kubernetes Kubelet. Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.325506 4970 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate rotation is enabled Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.325549 4970 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.325818 4970 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-10 16:01:12.985800437 +0000 UTC Nov 28 13:19:49 crc kubenswrapper[4970]: E1128 13:19:49.325960 4970 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.326587 4970 volume_manager.go:287] "The desired_state_of_world populator starts" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.326607 4970 volume_manager.go:289] "Starting Kubelet Volume Manager" Nov 28 13:19:49 crc kubenswrapper[4970]: E1128 13:19:49.326740 4970 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.212:6443: connect: connection refused" interval="200ms" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.326776 4970 desired_state_of_world_populator.go:146] "Desired state populator starts to run" Nov 28 13:19:49 crc kubenswrapper[4970]: E1128 13:19:49.326295 4970 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.102.83.212:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.187c2e3c0416d808 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2025-11-28 13:19:49.323913224 +0000 UTC m=+0.176795024,LastTimestamp:2025-11-28 13:19:49.323913224 +0000 UTC m=+0.176795024,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.327404 4970 factory.go:219] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Nov 28 13:19:49 crc 
kubenswrapper[4970]: W1128 13:19:49.327401 4970 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.212:6443: connect: connection refused Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.327435 4970 factory.go:55] Registering systemd factory Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.327456 4970 factory.go:221] Registration of the systemd container factory successfully Nov 28 13:19:49 crc kubenswrapper[4970]: E1128 13:19:49.327471 4970 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.212:6443: connect: connection refused" logger="UnhandledError" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.327799 4970 factory.go:153] Registering CRI-O factory Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.327827 4970 factory.go:221] Registration of the crio container factory successfully Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.327859 4970 factory.go:103] Registering Raw factory Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.327873 4970 manager.go:1196] Started watching for new ooms in manager Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.328523 4970 manager.go:319] Starting recovery of all containers Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.329009 4970 server.go:460] "Adding debug handlers to kubelet server" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.346584 4970 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" seLinuxMountContext="" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.346657 4970 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" seLinuxMountContext="" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.346678 4970 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" seLinuxMountContext="" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.346690 4970 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" seLinuxMountContext="" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.346702 4970 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" seLinuxMountContext="" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.346716 4970 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" 
volumeName="kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" seLinuxMountContext="" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.346727 4970 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" seLinuxMountContext="" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.346739 4970 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" seLinuxMountContext="" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.346754 4970 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" seLinuxMountContext="" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.346765 4970 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" seLinuxMountContext="" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.346776 4970 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" seLinuxMountContext="" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.346788 4970 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" seLinuxMountContext="" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.347456 4970 reconstruct.go:144] "Volume is marked device as uncertain and added into the actual state" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" deviceMountPath="/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.347537 4970 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" seLinuxMountContext="" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.347565 4970 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" seLinuxMountContext="" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.347584 4970 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" seLinuxMountContext="" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.347601 4970 reconstruct.go:130] "Volume is marked as 
uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script" seLinuxMountContext="" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.347623 4970 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" seLinuxMountContext="" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.347639 4970 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" seLinuxMountContext="" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.347656 4970 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" seLinuxMountContext="" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.347680 4970 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" seLinuxMountContext="" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.347700 4970 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" seLinuxMountContext="" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.347717 4970 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" seLinuxMountContext="" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.347734 4970 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" seLinuxMountContext="" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.347752 4970 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm" seLinuxMountContext="" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.347769 4970 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" seLinuxMountContext="" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.347787 4970 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" seLinuxMountContext="" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.347820 4970 reconstruct.go:130] "Volume is marked as uncertain and added 
into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" seLinuxMountContext="" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.347848 4970 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" seLinuxMountContext="" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.347872 4970 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d751cbb-f2e2-430d-9754-c882a5e924a5" volumeName="kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl" seLinuxMountContext="" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.347890 4970 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" seLinuxMountContext="" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.347906 4970 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" seLinuxMountContext="" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.347931 4970 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" seLinuxMountContext="" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.347948 4970 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" seLinuxMountContext="" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.347966 4970 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" seLinuxMountContext="" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.347986 4970 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" seLinuxMountContext="" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.348012 4970 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" seLinuxMountContext="" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.348030 4970 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" seLinuxMountContext="" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.348090 4970 reconstruct.go:130] "Volume is marked as uncertain and 
added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" seLinuxMountContext="" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.348110 4970 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" seLinuxMountContext="" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.348130 4970 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" volumeName="kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" seLinuxMountContext="" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.348147 4970 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" seLinuxMountContext="" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.348169 4970 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" seLinuxMountContext="" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.348188 4970 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert" seLinuxMountContext="" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.348238 4970 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" seLinuxMountContext="" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.348267 4970 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" seLinuxMountContext="" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.348288 4970 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" seLinuxMountContext="" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.348306 4970 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" seLinuxMountContext="" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.348324 4970 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" seLinuxMountContext="" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.348341 4970 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" seLinuxMountContext="" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.348367 4970 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" seLinuxMountContext="" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.348389 4970 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" seLinuxMountContext="" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.348408 4970 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" seLinuxMountContext="" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.348433 4970 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" seLinuxMountContext="" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.348454 4970 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" seLinuxMountContext="" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.348479 4970 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" seLinuxMountContext="" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.348532 4970 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" seLinuxMountContext="" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.348557 4970 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" seLinuxMountContext="" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.348576 4970 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides" seLinuxMountContext="" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.348594 4970 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" seLinuxMountContext="" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.348611 4970 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" seLinuxMountContext="" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.348628 4970 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" seLinuxMountContext="" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.348645 4970 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" seLinuxMountContext="" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.348662 4970 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" seLinuxMountContext="" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.348685 4970 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" seLinuxMountContext="" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.348703 4970 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" seLinuxMountContext="" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.348722 4970 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" seLinuxMountContext="" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.348749 4970 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" seLinuxMountContext="" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.348768 4970 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" seLinuxMountContext="" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.348797 4970 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" seLinuxMountContext="" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.348815 4970 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" seLinuxMountContext="" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.348831 4970 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" seLinuxMountContext="" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.348846 4970 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49ef4625-1d3a-4a9f-b595-c2433d32326d" volumeName="kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" seLinuxMountContext="" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.348862 4970 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" seLinuxMountContext="" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.348879 4970 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" seLinuxMountContext="" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.348897 4970 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" seLinuxMountContext="" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.348914 4970 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" seLinuxMountContext="" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.348931 4970 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" seLinuxMountContext="" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.348947 4970 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" seLinuxMountContext="" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.348989 4970 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" seLinuxMountContext="" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.349006 4970 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" seLinuxMountContext="" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.349022 4970 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" seLinuxMountContext="" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.349039 4970 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" 
volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" seLinuxMountContext="" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.349055 4970 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" seLinuxMountContext="" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.349073 4970 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" seLinuxMountContext="" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.349094 4970 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" seLinuxMountContext="" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.349112 4970 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" seLinuxMountContext="" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.349130 4970 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" seLinuxMountContext="" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.349149 4970 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" seLinuxMountContext="" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.349166 4970 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" seLinuxMountContext="" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.349183 4970 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" seLinuxMountContext="" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.349200 4970 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" seLinuxMountContext="" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.349238 4970 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" seLinuxMountContext="" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.349258 4970 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" 
volumeName="kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" seLinuxMountContext="" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.349277 4970 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3b6479f0-333b-4a96-9adf-2099afdc2447" volumeName="kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr" seLinuxMountContext="" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.349295 4970 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" seLinuxMountContext="" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.349314 4970 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" seLinuxMountContext="" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.349333 4970 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" seLinuxMountContext="" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.349350 4970 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" seLinuxMountContext="" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.349368 4970 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" seLinuxMountContext="" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.349386 4970 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" seLinuxMountContext="" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.349406 4970 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" seLinuxMountContext="" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.349424 4970 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" seLinuxMountContext="" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.349441 4970 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" seLinuxMountContext="" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.349459 4970 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" 
volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" seLinuxMountContext="" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.349483 4970 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" seLinuxMountContext="" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.349505 4970 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" seLinuxMountContext="" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.349531 4970 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" seLinuxMountContext="" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.349550 4970 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" seLinuxMountContext="" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.349571 4970 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" seLinuxMountContext="" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.349591 4970 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" seLinuxMountContext="" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.349607 4970 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" seLinuxMountContext="" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.349624 4970 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" seLinuxMountContext="" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.349641 4970 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" seLinuxMountContext="" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.349658 4970 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" seLinuxMountContext="" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.349675 4970 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" 
volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" seLinuxMountContext="" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.349690 4970 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" seLinuxMountContext="" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.349745 4970 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" seLinuxMountContext="" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.349764 4970 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" seLinuxMountContext="" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.349779 4970 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" seLinuxMountContext="" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.349796 4970 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" seLinuxMountContext="" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.349813 4970 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" seLinuxMountContext="" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.349829 4970 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" seLinuxMountContext="" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.349848 4970 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls" seLinuxMountContext="" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.349865 4970 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" seLinuxMountContext="" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.349882 4970 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" seLinuxMountContext="" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.349899 4970 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" 
volumeName="kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" seLinuxMountContext="" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.349916 4970 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" seLinuxMountContext="" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.349934 4970 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" seLinuxMountContext="" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.349950 4970 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" seLinuxMountContext="" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.349975 4970 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" seLinuxMountContext="" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.349991 4970 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" seLinuxMountContext="" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.350008 4970 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" seLinuxMountContext="" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.350026 4970 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf" seLinuxMountContext="" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.350042 4970 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" seLinuxMountContext="" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.350057 4970 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" seLinuxMountContext="" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.350073 4970 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" seLinuxMountContext="" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.350090 4970 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" 
volumeName="kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" seLinuxMountContext="" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.350107 4970 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" seLinuxMountContext="" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.350127 4970 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" seLinuxMountContext="" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.350146 4970 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" seLinuxMountContext="" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.350162 4970 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" seLinuxMountContext="" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.350178 4970 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" seLinuxMountContext="" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.350194 4970 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" seLinuxMountContext="" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.350210 4970 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" seLinuxMountContext="" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.350255 4970 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="44663579-783b-4372-86d6-acf235a62d72" volumeName="kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" seLinuxMountContext="" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.350272 4970 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" seLinuxMountContext="" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.350289 4970 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" seLinuxMountContext="" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.350311 4970 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" 
volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" seLinuxMountContext="" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.350329 4970 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" seLinuxMountContext="" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.350353 4970 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" seLinuxMountContext="" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.350371 4970 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" seLinuxMountContext="" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.350388 4970 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" seLinuxMountContext="" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.350404 4970 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5" seLinuxMountContext="" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.350430 4970 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" seLinuxMountContext="" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.350448 4970 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" seLinuxMountContext="" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.350464 4970 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" seLinuxMountContext="" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.350480 4970 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" seLinuxMountContext="" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.350496 4970 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" seLinuxMountContext="" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.350514 4970 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" 
volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" seLinuxMountContext="" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.350542 4970 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" seLinuxMountContext="" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.350565 4970 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" seLinuxMountContext="" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.350583 4970 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" seLinuxMountContext="" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.350600 4970 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" seLinuxMountContext="" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.350652 4970 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" seLinuxMountContext="" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.350678 4970 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" seLinuxMountContext="" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.350695 4970 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" seLinuxMountContext="" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.350718 4970 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" seLinuxMountContext="" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.350737 4970 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb" seLinuxMountContext="" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.350762 4970 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" seLinuxMountContext="" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.350780 4970 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" 
volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" seLinuxMountContext="" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.350799 4970 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" seLinuxMountContext="" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.350817 4970 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" seLinuxMountContext="" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.350834 4970 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" seLinuxMountContext="" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.350851 4970 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" seLinuxMountContext="" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.350869 4970 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" seLinuxMountContext="" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.350885 4970 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" seLinuxMountContext="" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.350901 4970 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" seLinuxMountContext="" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.350918 4970 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" seLinuxMountContext="" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.350934 4970 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" seLinuxMountContext="" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.350950 4970 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" seLinuxMountContext="" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.350969 4970 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" 
volumeName="kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" seLinuxMountContext="" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.350986 4970 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" seLinuxMountContext="" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.351004 4970 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" volumeName="kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" seLinuxMountContext="" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.351019 4970 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" seLinuxMountContext="" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.351034 4970 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" seLinuxMountContext="" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.351049 4970 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" seLinuxMountContext="" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.351065 4970 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" seLinuxMountContext="" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.351081 4970 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert" seLinuxMountContext="" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.351106 4970 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" seLinuxMountContext="" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.351123 4970 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" seLinuxMountContext="" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.351139 4970 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" seLinuxMountContext="" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.351154 4970 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" 
volumeName="kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" seLinuxMountContext="" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.351169 4970 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" seLinuxMountContext="" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.351187 4970 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" seLinuxMountContext="" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.351202 4970 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf" seLinuxMountContext="" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.351242 4970 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" seLinuxMountContext="" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.351262 4970 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" seLinuxMountContext="" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.351279 4970 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" seLinuxMountContext="" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.351297 4970 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" seLinuxMountContext="" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.351314 4970 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" seLinuxMountContext="" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.351331 4970 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" seLinuxMountContext="" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.351348 4970 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" seLinuxMountContext="" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.351364 4970 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" 
volumeName="kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" seLinuxMountContext="" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.351382 4970 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" seLinuxMountContext="" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.351398 4970 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" seLinuxMountContext="" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.351415 4970 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" seLinuxMountContext="" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.351434 4970 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" seLinuxMountContext="" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.351450 4970 reconstruct.go:97] "Volume reconstruction finished" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.351461 4970 reconciler.go:26] "Reconciler: start to sync state" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.355755 4970 manager.go:324] Recovery completed Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.366012 4970 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.367725 4970 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.367758 4970 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.367769 4970 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.368440 4970 cpu_manager.go:225] "Starting CPU manager" policy="none" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.368453 4970 cpu_manager.go:226] "Reconciling" reconcilePeriod="10s" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.368468 4970 state_mem.go:36] "Initialized new in-memory state store" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.377884 4970 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.379537 4970 kubelet_network_linux.go:50] "Initialized iptables rules." 
protocol="IPv6" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.379574 4970 status_manager.go:217] "Starting to sync pod status with apiserver" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.379601 4970 kubelet.go:2335] "Starting kubelet main sync loop" Nov 28 13:19:49 crc kubenswrapper[4970]: E1128 13:19:49.379714 4970 kubelet.go:2359] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.380394 4970 policy_none.go:49] "None policy: Start" Nov 28 13:19:49 crc kubenswrapper[4970]: W1128 13:19:49.381014 4970 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.212:6443: connect: connection refused Nov 28 13:19:49 crc kubenswrapper[4970]: E1128 13:19:49.381075 4970 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.212:6443: connect: connection refused" logger="UnhandledError" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.383798 4970 memory_manager.go:170] "Starting memorymanager" policy="None" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.383826 4970 state_mem.go:35] "Initializing new in-memory state store" Nov 28 13:19:49 crc kubenswrapper[4970]: E1128 13:19:49.426639 4970 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.437277 4970 manager.go:334] "Starting Device Plugin manager" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.437401 4970 manager.go:513] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.437416 4970 server.go:79] "Starting device plugin registration server" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.437882 4970 eviction_manager.go:189] "Eviction manager: starting control loop" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.437897 4970 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.439186 4970 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.439290 4970 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.439296 4970 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Nov 28 13:19:49 crc kubenswrapper[4970]: E1128 13:19:49.443755 4970 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.480093 4970 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-etcd/etcd-crc","openshift-kube-apiserver/kube-apiserver-crc","openshift-kube-controller-manager/kube-controller-manager-crc","openshift-kube-scheduler/openshift-kube-scheduler-crc","openshift-machine-config-operator/kube-rbac-proxy-crio-crc"] Nov 28 13:19:49 crc kubenswrapper[4970]: 
I1128 13:19:49.480468 4970 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.481556 4970 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.481608 4970 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.481627 4970 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.481862 4970 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.482198 4970 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.482334 4970 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.483117 4970 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.483154 4970 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.483163 4970 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.483122 4970 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.483391 4970 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.483481 4970 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.483701 4970 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.483806 4970 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.483837 4970 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.485498 4970 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.485518 4970 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.485526 4970 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.485629 4970 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.485739 4970 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.485780 4970 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.485799 4970 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.485780 4970 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.485891 4970 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.486356 4970 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.486373 4970 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.486381 4970 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.486447 4970 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.486743 4970 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.486763 4970 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.487038 4970 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.487074 4970 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.487090 4970 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.487695 4970 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.487728 4970 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.487739 4970 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.487907 4970 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.488012 4970 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.488105 4970 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.488325 4970 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.488426 4970 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.489060 4970 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.489081 4970 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.489089 4970 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:19:49 crc kubenswrapper[4970]: E1128 13:19:49.527230 4970 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.212:6443: connect: connection refused" interval="400ms" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.538143 4970 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.539167 4970 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.539191 4970 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.539201 4970 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.539234 4970 kubelet_node_status.go:76] "Attempting to register node" node="crc" Nov 28 13:19:49 crc kubenswrapper[4970]: E1128 13:19:49.539593 4970 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.212:6443: connect: connection refused" node="crc" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.552690 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.552723 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.552746 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.552762 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.552784 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.552905 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.552986 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.553034 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.553059 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.553083 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.553105 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.553144 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.553179 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-local-bin\" (UniqueName: 
\"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.553227 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.553252 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.654654 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.654726 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.654761 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.654793 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.654822 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.654851 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.654878 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.654905 4970 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.654935 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.654963 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.654992 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.655019 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.655066 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.655095 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.655124 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.655491 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.655552 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" 
Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.655586 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.655563 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.655610 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.655661 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.655698 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.655700 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.655629 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.655481 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.655527 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.655676 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: 
\"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.655652 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.655669 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.655809 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.740293 4970 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.742902 4970 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.742940 4970 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.742951 4970 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.742997 4970 kubelet_node_status.go:76] "Attempting to register node" node="crc" Nov 28 13:19:49 crc kubenswrapper[4970]: E1128 13:19:49.743585 4970 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.212:6443: connect: connection refused" node="crc" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.805665 4970 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc" Nov 28 13:19:49 crc kubenswrapper[4970]: W1128 13:19:49.828363 4970 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2139d3e2895fc6797b9c76a1b4c9886d.slice/crio-abff9ff0ddf7860521ab0580036ba66653dcbd7cc25319402d1c92584c1e05a3 WatchSource:0}: Error finding container abff9ff0ddf7860521ab0580036ba66653dcbd7cc25319402d1c92584c1e05a3: Status 404 returned error can't find the container with id abff9ff0ddf7860521ab0580036ba66653dcbd7cc25319402d1c92584c1e05a3 Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.828814 4970 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.837152 4970 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Nov 28 13:19:49 crc kubenswrapper[4970]: W1128 13:19:49.849660 4970 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf4b27818a5e8e43d0dc095d08835c792.slice/crio-cc5ec6c22ab319792ccfc7463c51dfef4217c8033e40548f473c6f5e0e750d7c WatchSource:0}: Error finding container cc5ec6c22ab319792ccfc7463c51dfef4217c8033e40548f473c6f5e0e750d7c: Status 404 returned error can't find the container with id cc5ec6c22ab319792ccfc7463c51dfef4217c8033e40548f473c6f5e0e750d7c Nov 28 13:19:49 crc kubenswrapper[4970]: W1128 13:19:49.851282 4970 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf614b9022728cf315e60c057852e563e.slice/crio-4331ab789322d806c6143b2f5603a2416bab740e0713e9a6e65ffd8b2b8b3f54 WatchSource:0}: Error finding container 4331ab789322d806c6143b2f5603a2416bab740e0713e9a6e65ffd8b2b8b3f54: Status 404 returned error can't find the container with id 4331ab789322d806c6143b2f5603a2416bab740e0713e9a6e65ffd8b2b8b3f54 Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.855876 4970 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Nov 28 13:19:49 crc kubenswrapper[4970]: I1128 13:19:49.861668 4970 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Nov 28 13:19:49 crc kubenswrapper[4970]: W1128 13:19:49.871636 4970 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3dcd261975c3d6b9a6ad6367fd4facd3.slice/crio-0b9ee60fd3815126ba56960603680dd02119d17ccd8e77ce19b5c174113f1506 WatchSource:0}: Error finding container 0b9ee60fd3815126ba56960603680dd02119d17ccd8e77ce19b5c174113f1506: Status 404 returned error can't find the container with id 0b9ee60fd3815126ba56960603680dd02119d17ccd8e77ce19b5c174113f1506 Nov 28 13:19:49 crc kubenswrapper[4970]: W1128 13:19:49.879772 4970 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd1b160f5dda77d281dd8e69ec8d817f9.slice/crio-cf83ea70ecbec45df5de2f3ecc9bd6102c199fa54312a3e4db46422d43bcf8de WatchSource:0}: Error finding container cf83ea70ecbec45df5de2f3ecc9bd6102c199fa54312a3e4db46422d43bcf8de: Status 404 returned error can't find the container with id cf83ea70ecbec45df5de2f3ecc9bd6102c199fa54312a3e4db46422d43bcf8de Nov 28 13:19:49 crc kubenswrapper[4970]: E1128 13:19:49.927984 4970 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.212:6443: connect: connection refused" interval="800ms" Nov 28 13:19:50 crc kubenswrapper[4970]: I1128 13:19:50.144376 4970 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 28 13:19:50 crc kubenswrapper[4970]: I1128 13:19:50.145788 4970 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:19:50 crc kubenswrapper[4970]: I1128 13:19:50.145829 4970 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:19:50 crc kubenswrapper[4970]: I1128 13:19:50.145838 4970 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:19:50 crc kubenswrapper[4970]: I1128 13:19:50.145861 4970 kubelet_node_status.go:76] "Attempting to register node" node="crc" Nov 28 13:19:50 crc kubenswrapper[4970]: E1128 13:19:50.146320 4970 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.212:6443: connect: connection refused" node="crc" Nov 28 13:19:50 crc kubenswrapper[4970]: I1128 13:19:50.325765 4970 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.212:6443: connect: connection refused Nov 28 13:19:50 crc kubenswrapper[4970]: I1128 13:19:50.326742 4970 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-23 16:48:13.836582242 +0000 UTC Nov 28 13:19:50 crc kubenswrapper[4970]: I1128 13:19:50.387206 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"cc5ec6c22ab319792ccfc7463c51dfef4217c8033e40548f473c6f5e0e750d7c"} Nov 28 13:19:50 crc kubenswrapper[4970]: I1128 13:19:50.388820 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"abff9ff0ddf7860521ab0580036ba66653dcbd7cc25319402d1c92584c1e05a3"} Nov 28 13:19:50 crc kubenswrapper[4970]: I1128 13:19:50.390335 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"cf83ea70ecbec45df5de2f3ecc9bd6102c199fa54312a3e4db46422d43bcf8de"} Nov 28 13:19:50 crc kubenswrapper[4970]: I1128 13:19:50.391646 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"0b9ee60fd3815126ba56960603680dd02119d17ccd8e77ce19b5c174113f1506"} Nov 28 13:19:50 crc kubenswrapper[4970]: I1128 13:19:50.392976 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"4331ab789322d806c6143b2f5603a2416bab740e0713e9a6e65ffd8b2b8b3f54"} Nov 28 13:19:50 crc kubenswrapper[4970]: W1128 13:19:50.420901 4970 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.212:6443: connect: connection refused Nov 28 13:19:50 crc kubenswrapper[4970]: E1128 13:19:50.421022 4970 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.212:6443: connect: connection refused" logger="UnhandledError" Nov 28 13:19:50 crc kubenswrapper[4970]: W1128 13:19:50.634823 4970 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get 
"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.212:6443: connect: connection refused Nov 28 13:19:50 crc kubenswrapper[4970]: E1128 13:19:50.634932 4970 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.212:6443: connect: connection refused" logger="UnhandledError" Nov 28 13:19:50 crc kubenswrapper[4970]: W1128 13:19:50.680843 4970 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.212:6443: connect: connection refused Nov 28 13:19:50 crc kubenswrapper[4970]: E1128 13:19:50.680946 4970 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.212:6443: connect: connection refused" logger="UnhandledError" Nov 28 13:19:50 crc kubenswrapper[4970]: E1128 13:19:50.729449 4970 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.212:6443: connect: connection refused" interval="1.6s" Nov 28 13:19:50 crc kubenswrapper[4970]: W1128 13:19:50.923765 4970 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.212:6443: connect: connection refused Nov 28 13:19:50 crc kubenswrapper[4970]: E1128 13:19:50.923886 4970 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.212:6443: connect: connection refused" logger="UnhandledError" Nov 28 13:19:50 crc kubenswrapper[4970]: I1128 13:19:50.946576 4970 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 28 13:19:50 crc kubenswrapper[4970]: I1128 13:19:50.948607 4970 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:19:50 crc kubenswrapper[4970]: I1128 13:19:50.948657 4970 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:19:50 crc kubenswrapper[4970]: I1128 13:19:50.948676 4970 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:19:50 crc kubenswrapper[4970]: I1128 13:19:50.948708 4970 kubelet_node_status.go:76] "Attempting to register node" node="crc" Nov 28 13:19:50 crc kubenswrapper[4970]: E1128 13:19:50.949298 4970 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.212:6443: connect: connection refused" node="crc" Nov 28 13:19:51 crc kubenswrapper[4970]: I1128 13:19:51.327169 4970 certificate_manager.go:356] 
kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-05 23:33:41.027637654 +0000 UTC Nov 28 13:19:51 crc kubenswrapper[4970]: I1128 13:19:51.327260 4970 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 922h13m49.700384686s for next certificate rotation Nov 28 13:19:51 crc kubenswrapper[4970]: I1128 13:19:51.351530 4970 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.212:6443: connect: connection refused Nov 28 13:19:51 crc kubenswrapper[4970]: I1128 13:19:51.398147 4970 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="9535aede1ed0a237e5190688ad4f4177d2a18a1bde3c6fb051dd71fdbc61ed52" exitCode=0 Nov 28 13:19:51 crc kubenswrapper[4970]: I1128 13:19:51.398258 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"9535aede1ed0a237e5190688ad4f4177d2a18a1bde3c6fb051dd71fdbc61ed52"} Nov 28 13:19:51 crc kubenswrapper[4970]: I1128 13:19:51.398332 4970 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 28 13:19:51 crc kubenswrapper[4970]: I1128 13:19:51.399719 4970 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:19:51 crc kubenswrapper[4970]: I1128 13:19:51.399779 4970 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:19:51 crc kubenswrapper[4970]: I1128 13:19:51.399800 4970 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:19:51 crc kubenswrapper[4970]: I1128 13:19:51.400512 4970 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="592832f9dde7d730854c8b97f5c97013cfa9ad17815ea96ea3d93d54d9a8d69e" exitCode=0 Nov 28 13:19:51 crc kubenswrapper[4970]: I1128 13:19:51.400603 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"592832f9dde7d730854c8b97f5c97013cfa9ad17815ea96ea3d93d54d9a8d69e"} Nov 28 13:19:51 crc kubenswrapper[4970]: I1128 13:19:51.400632 4970 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 28 13:19:51 crc kubenswrapper[4970]: I1128 13:19:51.401818 4970 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:19:51 crc kubenswrapper[4970]: I1128 13:19:51.401886 4970 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:19:51 crc kubenswrapper[4970]: I1128 13:19:51.401910 4970 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:19:51 crc kubenswrapper[4970]: I1128 13:19:51.403319 4970 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 28 13:19:51 crc kubenswrapper[4970]: I1128 13:19:51.404301 4970 generic.go:334] "Generic (PLEG): container finished" podID="d1b160f5dda77d281dd8e69ec8d817f9" containerID="5990c2092133ee60909154d6df6f352b919c65cc4c986937fd5bfd300487e8cf" exitCode=0 Nov 28 13:19:51 crc 
kubenswrapper[4970]: I1128 13:19:51.404335 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerDied","Data":"5990c2092133ee60909154d6df6f352b919c65cc4c986937fd5bfd300487e8cf"} Nov 28 13:19:51 crc kubenswrapper[4970]: I1128 13:19:51.404401 4970 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 28 13:19:51 crc kubenswrapper[4970]: I1128 13:19:51.404614 4970 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:19:51 crc kubenswrapper[4970]: I1128 13:19:51.404648 4970 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:19:51 crc kubenswrapper[4970]: I1128 13:19:51.404663 4970 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:19:51 crc kubenswrapper[4970]: I1128 13:19:51.406253 4970 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:19:51 crc kubenswrapper[4970]: I1128 13:19:51.406282 4970 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:19:51 crc kubenswrapper[4970]: I1128 13:19:51.406294 4970 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:19:51 crc kubenswrapper[4970]: I1128 13:19:51.411861 4970 generic.go:334] "Generic (PLEG): container finished" podID="3dcd261975c3d6b9a6ad6367fd4facd3" containerID="f79d3a97b7299fb710bde40fab7d7c044406fcedc4d6f4d64a9d2a6b3153c929" exitCode=0 Nov 28 13:19:51 crc kubenswrapper[4970]: I1128 13:19:51.411944 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerDied","Data":"f79d3a97b7299fb710bde40fab7d7c044406fcedc4d6f4d64a9d2a6b3153c929"} Nov 28 13:19:51 crc kubenswrapper[4970]: I1128 13:19:51.411967 4970 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 28 13:19:51 crc kubenswrapper[4970]: I1128 13:19:51.412990 4970 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:19:51 crc kubenswrapper[4970]: I1128 13:19:51.413029 4970 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:19:51 crc kubenswrapper[4970]: I1128 13:19:51.413047 4970 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:19:51 crc kubenswrapper[4970]: I1128 13:19:51.414152 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"33cbcaa6257083258112041892758c528dc9c9c165831253ec7892e8c7e2b451"} Nov 28 13:19:52 crc kubenswrapper[4970]: I1128 13:19:52.326369 4970 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.212:6443: connect: connection refused Nov 28 13:19:52 crc kubenswrapper[4970]: E1128 13:19:52.331033 4970 controller.go:145] "Failed to ensure lease exists, will retry" err="Get 
\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.212:6443: connect: connection refused" interval="3.2s" Nov 28 13:19:52 crc kubenswrapper[4970]: I1128 13:19:52.426385 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"f46d2b4c68b84db6acc7f31149e4a0533f4a6cb8ffb9fdf729b6957ff65c2541"} Nov 28 13:19:52 crc kubenswrapper[4970]: I1128 13:19:52.429969 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"79b7c5cf7ed967590dbf3a5af0d25db8c95389d8b2e6c85bcbb34f668b2202a5"} Nov 28 13:19:52 crc kubenswrapper[4970]: I1128 13:19:52.432008 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"db18f7f1730110037ceac0335bfd4c4176cd8e6b4f3aeb9f7caacaeb82c0a9e9"} Nov 28 13:19:52 crc kubenswrapper[4970]: I1128 13:19:52.434281 4970 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="e8cdfd48b09531200d08fe9d1f41f438ff10877dc9a1cb253001357a21b09950" exitCode=0 Nov 28 13:19:52 crc kubenswrapper[4970]: I1128 13:19:52.434349 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"e8cdfd48b09531200d08fe9d1f41f438ff10877dc9a1cb253001357a21b09950"} Nov 28 13:19:52 crc kubenswrapper[4970]: I1128 13:19:52.434476 4970 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 28 13:19:52 crc kubenswrapper[4970]: I1128 13:19:52.435353 4970 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:19:52 crc kubenswrapper[4970]: I1128 13:19:52.435375 4970 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:19:52 crc kubenswrapper[4970]: I1128 13:19:52.435387 4970 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:19:52 crc kubenswrapper[4970]: I1128 13:19:52.437524 4970 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 28 13:19:52 crc kubenswrapper[4970]: I1128 13:19:52.437462 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"632579048cea24f0ef229ed13b0ee56bf7ce688f2997c1542e173cb48dc16acd"} Nov 28 13:19:52 crc kubenswrapper[4970]: I1128 13:19:52.439698 4970 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:19:52 crc kubenswrapper[4970]: I1128 13:19:52.439744 4970 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:19:52 crc kubenswrapper[4970]: I1128 13:19:52.439756 4970 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:19:52 crc kubenswrapper[4970]: I1128 13:19:52.550146 4970 kubelet_node_status.go:401] "Setting node annotation to enable volume controller 
attach/detach" Nov 28 13:19:52 crc kubenswrapper[4970]: I1128 13:19:52.551384 4970 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:19:52 crc kubenswrapper[4970]: I1128 13:19:52.551428 4970 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:19:52 crc kubenswrapper[4970]: I1128 13:19:52.551437 4970 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:19:52 crc kubenswrapper[4970]: I1128 13:19:52.551459 4970 kubelet_node_status.go:76] "Attempting to register node" node="crc" Nov 28 13:19:52 crc kubenswrapper[4970]: E1128 13:19:52.551860 4970 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.212:6443: connect: connection refused" node="crc" Nov 28 13:19:53 crc kubenswrapper[4970]: I1128 13:19:53.443635 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"a60e6243fba7fec0ee681807511bac49ca5b4a3ecdd23880e84394bef94179ca"} Nov 28 13:19:53 crc kubenswrapper[4970]: I1128 13:19:53.443707 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"026a3ad053b1ea666f8fc2fbb142ba64ae33c743d4762b9fd63060cd3d5c0524"} Nov 28 13:19:53 crc kubenswrapper[4970]: I1128 13:19:53.443749 4970 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 28 13:19:53 crc kubenswrapper[4970]: I1128 13:19:53.445090 4970 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:19:53 crc kubenswrapper[4970]: I1128 13:19:53.445141 4970 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:19:53 crc kubenswrapper[4970]: I1128 13:19:53.445159 4970 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:19:53 crc kubenswrapper[4970]: I1128 13:19:53.447299 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"819ea9ab3463a3a642fbb1bea3766d11edfc1933f96078cf745ad49467e05cdb"} Nov 28 13:19:53 crc kubenswrapper[4970]: I1128 13:19:53.447342 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"35e59a27d99f7c69f9385529d5fa968b47b07eaec0030abc43e0b8d444ea481b"} Nov 28 13:19:53 crc kubenswrapper[4970]: I1128 13:19:53.447367 4970 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 28 13:19:53 crc kubenswrapper[4970]: I1128 13:19:53.448308 4970 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:19:53 crc kubenswrapper[4970]: I1128 13:19:53.448344 4970 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:19:53 crc kubenswrapper[4970]: I1128 13:19:53.448360 4970 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:19:53 crc kubenswrapper[4970]: I1128 13:19:53.450125 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"6d5b3592754b29f4aebbcfcdd7d4f38b10e70438680d2f9d86409bd2af4c1b84"} Nov 28 13:19:53 crc kubenswrapper[4970]: I1128 13:19:53.450183 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"1886c741d951051096d9345e87c6c1386f48b9576ae2d6ed095a79be42c5297c"} Nov 28 13:19:53 crc kubenswrapper[4970]: I1128 13:19:53.450202 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"7c49f9caf580dac97faa02766cf5d85c9483f0b201e1987c285ad9aa45178374"} Nov 28 13:19:53 crc kubenswrapper[4970]: I1128 13:19:53.452184 4970 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="06078e0f144246acd29170b62e73cab68bddb87e1d7c91b98de26d7a17c240a9" exitCode=0 Nov 28 13:19:53 crc kubenswrapper[4970]: I1128 13:19:53.452237 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"06078e0f144246acd29170b62e73cab68bddb87e1d7c91b98de26d7a17c240a9"} Nov 28 13:19:53 crc kubenswrapper[4970]: I1128 13:19:53.452264 4970 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 28 13:19:53 crc kubenswrapper[4970]: I1128 13:19:53.452314 4970 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 28 13:19:53 crc kubenswrapper[4970]: I1128 13:19:53.452858 4970 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:19:53 crc kubenswrapper[4970]: I1128 13:19:53.452884 4970 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:19:53 crc kubenswrapper[4970]: I1128 13:19:53.452891 4970 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:19:53 crc kubenswrapper[4970]: I1128 13:19:53.453443 4970 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:19:53 crc kubenswrapper[4970]: I1128 13:19:53.453476 4970 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:19:53 crc kubenswrapper[4970]: I1128 13:19:53.453489 4970 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:19:53 crc kubenswrapper[4970]: I1128 13:19:53.960857 4970 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Nov 28 13:19:54 crc kubenswrapper[4970]: I1128 13:19:54.457580 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"bc1d1b837f41d077460510ae7411f75f6f4a002d73f33e670c1a133cccfa8315"} Nov 28 13:19:54 crc kubenswrapper[4970]: I1128 13:19:54.457618 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"39d44e7a1c9ef0eac156ad2b20e57ae71ce7c4c1336ef1160935b0207520b1a7"} Nov 28 13:19:54 crc kubenswrapper[4970]: I1128 13:19:54.461670 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"89211b552b7e2893a562bd785f35642f3b6792b0a123133d459a728b9ca6f5f3"} Nov 28 13:19:54 crc kubenswrapper[4970]: I1128 13:19:54.461716 4970 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 28 13:19:54 crc kubenswrapper[4970]: I1128 13:19:54.461796 4970 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 28 13:19:54 crc kubenswrapper[4970]: I1128 13:19:54.461836 4970 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Nov 28 13:19:54 crc kubenswrapper[4970]: I1128 13:19:54.461888 4970 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 28 13:19:54 crc kubenswrapper[4970]: I1128 13:19:54.462796 4970 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:19:54 crc kubenswrapper[4970]: I1128 13:19:54.462832 4970 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:19:54 crc kubenswrapper[4970]: I1128 13:19:54.462833 4970 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:19:54 crc kubenswrapper[4970]: I1128 13:19:54.462865 4970 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:19:54 crc kubenswrapper[4970]: I1128 13:19:54.462883 4970 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:19:54 crc kubenswrapper[4970]: I1128 13:19:54.462844 4970 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:19:54 crc kubenswrapper[4970]: I1128 13:19:54.463397 4970 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:19:54 crc kubenswrapper[4970]: I1128 13:19:54.463429 4970 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:19:54 crc kubenswrapper[4970]: I1128 13:19:54.463446 4970 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:19:55 crc kubenswrapper[4970]: I1128 13:19:55.168905 4970 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Nov 28 13:19:55 crc kubenswrapper[4970]: I1128 13:19:55.471389 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"7a8323289d03ccd7160a32fc2abc42d8f1f0758b6a69306732c78b51d29c5c3b"} Nov 28 13:19:55 crc kubenswrapper[4970]: I1128 13:19:55.471452 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"971980ace67b00fd8d30fb5a178eac9e6d175dfd835c9a04986bd7c921994e33"} Nov 28 13:19:55 crc 
kubenswrapper[4970]: I1128 13:19:55.471475 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"8e90fdab347c53d77bf5b6697ec5677585a48dec66d5022b0575ced14896fec7"} Nov 28 13:19:55 crc kubenswrapper[4970]: I1128 13:19:55.471489 4970 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 28 13:19:55 crc kubenswrapper[4970]: I1128 13:19:55.471561 4970 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 28 13:19:55 crc kubenswrapper[4970]: I1128 13:19:55.471606 4970 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 28 13:19:55 crc kubenswrapper[4970]: I1128 13:19:55.472551 4970 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 28 13:19:55 crc kubenswrapper[4970]: I1128 13:19:55.472719 4970 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 28 13:19:55 crc kubenswrapper[4970]: I1128 13:19:55.473387 4970 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:19:55 crc kubenswrapper[4970]: I1128 13:19:55.473438 4970 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:19:55 crc kubenswrapper[4970]: I1128 13:19:55.473455 4970 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:19:55 crc kubenswrapper[4970]: I1128 13:19:55.473470 4970 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:19:55 crc kubenswrapper[4970]: I1128 13:19:55.473514 4970 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:19:55 crc kubenswrapper[4970]: I1128 13:19:55.473536 4970 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:19:55 crc kubenswrapper[4970]: I1128 13:19:55.473951 4970 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:19:55 crc kubenswrapper[4970]: I1128 13:19:55.474015 4970 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:19:55 crc kubenswrapper[4970]: I1128 13:19:55.474039 4970 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:19:55 crc kubenswrapper[4970]: I1128 13:19:55.474283 4970 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:19:55 crc kubenswrapper[4970]: I1128 13:19:55.474309 4970 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:19:55 crc kubenswrapper[4970]: I1128 13:19:55.474324 4970 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:19:55 crc kubenswrapper[4970]: I1128 13:19:55.752302 4970 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 28 13:19:55 crc kubenswrapper[4970]: I1128 13:19:55.753655 4970 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:19:55 crc 
kubenswrapper[4970]: I1128 13:19:55.753694 4970 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:19:55 crc kubenswrapper[4970]: I1128 13:19:55.753711 4970 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:19:55 crc kubenswrapper[4970]: I1128 13:19:55.753734 4970 kubelet_node_status.go:76] "Attempting to register node" node="crc" Nov 28 13:19:56 crc kubenswrapper[4970]: I1128 13:19:56.183278 4970 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 28 13:19:56 crc kubenswrapper[4970]: I1128 13:19:56.294463 4970 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 28 13:19:56 crc kubenswrapper[4970]: I1128 13:19:56.474518 4970 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 28 13:19:56 crc kubenswrapper[4970]: I1128 13:19:56.474554 4970 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 28 13:19:56 crc kubenswrapper[4970]: I1128 13:19:56.474576 4970 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 28 13:19:56 crc kubenswrapper[4970]: I1128 13:19:56.476561 4970 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:19:56 crc kubenswrapper[4970]: I1128 13:19:56.476604 4970 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:19:56 crc kubenswrapper[4970]: I1128 13:19:56.476618 4970 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:19:56 crc kubenswrapper[4970]: I1128 13:19:56.476643 4970 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:19:56 crc kubenswrapper[4970]: I1128 13:19:56.476623 4970 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:19:56 crc kubenswrapper[4970]: I1128 13:19:56.476665 4970 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:19:56 crc kubenswrapper[4970]: I1128 13:19:56.476657 4970 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:19:56 crc kubenswrapper[4970]: I1128 13:19:56.476679 4970 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:19:56 crc kubenswrapper[4970]: I1128 13:19:56.476690 4970 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:19:57 crc kubenswrapper[4970]: I1128 13:19:57.476809 4970 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 28 13:19:57 crc kubenswrapper[4970]: I1128 13:19:57.478104 4970 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:19:57 crc kubenswrapper[4970]: I1128 13:19:57.478158 4970 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:19:57 crc kubenswrapper[4970]: I1128 13:19:57.478176 4970 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" 
Nov 28 13:19:58 crc kubenswrapper[4970]: I1128 13:19:58.169242 4970 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Nov 28 13:19:58 crc kubenswrapper[4970]: I1128 13:19:58.169367 4970 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Nov 28 13:19:59 crc kubenswrapper[4970]: E1128 13:19:59.444112 4970 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Nov 28 13:20:00 crc kubenswrapper[4970]: I1128 13:20:00.100651 4970 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-etcd/etcd-crc" Nov 28 13:20:00 crc kubenswrapper[4970]: I1128 13:20:00.100931 4970 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 28 13:20:00 crc kubenswrapper[4970]: I1128 13:20:00.102617 4970 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:20:00 crc kubenswrapper[4970]: I1128 13:20:00.102680 4970 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:20:00 crc kubenswrapper[4970]: I1128 13:20:00.102705 4970 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:20:02 crc kubenswrapper[4970]: I1128 13:20:02.662391 4970 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Nov 28 13:20:02 crc kubenswrapper[4970]: I1128 13:20:02.662669 4970 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 28 13:20:02 crc kubenswrapper[4970]: I1128 13:20:02.665490 4970 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:20:02 crc kubenswrapper[4970]: I1128 13:20:02.665531 4970 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:20:02 crc kubenswrapper[4970]: I1128 13:20:02.665543 4970 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:20:02 crc kubenswrapper[4970]: I1128 13:20:02.670114 4970 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Nov 28 13:20:02 crc kubenswrapper[4970]: W1128 13:20:02.958068 4970 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": net/http: TLS handshake timeout Nov 28 13:20:02 crc kubenswrapper[4970]: I1128 13:20:02.958168 4970 trace.go:236] Trace[1081547087]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (28-Nov-2025 13:19:52.956) (total time: 10001ms): Nov 28 13:20:02 crc 
kubenswrapper[4970]: Trace[1081547087]: ---"Objects listed" error:Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": net/http: TLS handshake timeout 10001ms (13:20:02.958) Nov 28 13:20:02 crc kubenswrapper[4970]: Trace[1081547087]: [10.001649913s] [10.001649913s] END Nov 28 13:20:02 crc kubenswrapper[4970]: E1128 13:20:02.958190 4970 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": net/http: TLS handshake timeout" logger="UnhandledError" Nov 28 13:20:03 crc kubenswrapper[4970]: I1128 13:20:03.000555 4970 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Nov 28 13:20:03 crc kubenswrapper[4970]: W1128 13:20:03.286766 4970 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": net/http: TLS handshake timeout Nov 28 13:20:03 crc kubenswrapper[4970]: I1128 13:20:03.286876 4970 trace.go:236] Trace[1393669189]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (28-Nov-2025 13:19:53.284) (total time: 10002ms): Nov 28 13:20:03 crc kubenswrapper[4970]: Trace[1393669189]: ---"Objects listed" error:Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": net/http: TLS handshake timeout 10002ms (13:20:03.286) Nov 28 13:20:03 crc kubenswrapper[4970]: Trace[1393669189]: [10.00212943s] [10.00212943s] END Nov 28 13:20:03 crc kubenswrapper[4970]: E1128 13:20:03.286896 4970 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": net/http: TLS handshake timeout" logger="UnhandledError" Nov 28 13:20:03 crc kubenswrapper[4970]: I1128 13:20:03.325535 4970 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": net/http: TLS handshake timeout Nov 28 13:20:03 crc kubenswrapper[4970]: W1128 13:20:03.326772 4970 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": net/http: TLS handshake timeout Nov 28 13:20:03 crc kubenswrapper[4970]: I1128 13:20:03.326890 4970 trace.go:236] Trace[2028466523]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (28-Nov-2025 13:19:53.325) (total time: 10001ms): Nov 28 13:20:03 crc kubenswrapper[4970]: Trace[2028466523]: ---"Objects listed" error:Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": net/http: TLS handshake timeout 10001ms (13:20:03.326) Nov 28 13:20:03 crc kubenswrapper[4970]: Trace[2028466523]: [10.00152109s] [10.00152109s] END Nov 28 13:20:03 crc kubenswrapper[4970]: E1128 13:20:03.326920 4970 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get 
\"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": net/http: TLS handshake timeout" logger="UnhandledError" Nov 28 13:20:03 crc kubenswrapper[4970]: W1128 13:20:03.459182 4970 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": net/http: TLS handshake timeout Nov 28 13:20:03 crc kubenswrapper[4970]: I1128 13:20:03.459360 4970 trace.go:236] Trace[687776220]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (28-Nov-2025 13:19:53.458) (total time: 10001ms): Nov 28 13:20:03 crc kubenswrapper[4970]: Trace[687776220]: ---"Objects listed" error:Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": net/http: TLS handshake timeout 10000ms (13:20:03.459) Nov 28 13:20:03 crc kubenswrapper[4970]: Trace[687776220]: [10.001066906s] [10.001066906s] END Nov 28 13:20:03 crc kubenswrapper[4970]: E1128 13:20:03.459397 4970 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": net/http: TLS handshake timeout" logger="UnhandledError" Nov 28 13:20:03 crc kubenswrapper[4970]: I1128 13:20:03.493074 4970 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 28 13:20:03 crc kubenswrapper[4970]: I1128 13:20:03.494033 4970 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:20:03 crc kubenswrapper[4970]: I1128 13:20:03.494082 4970 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:20:03 crc kubenswrapper[4970]: I1128 13:20:03.494100 4970 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:20:03 crc kubenswrapper[4970]: I1128 13:20:03.497717 4970 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Nov 28 13:20:03 crc kubenswrapper[4970]: I1128 13:20:03.537382 4970 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Nov 28 13:20:03 crc kubenswrapper[4970]: I1128 13:20:03.537465 4970 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Nov 28 13:20:03 crc kubenswrapper[4970]: I1128 13:20:03.557805 4970 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Nov 28 13:20:03 crc 
kubenswrapper[4970]: I1128 13:20:03.557874 4970 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Nov 28 13:20:04 crc kubenswrapper[4970]: I1128 13:20:04.433651 4970 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-etcd/etcd-crc" Nov 28 13:20:04 crc kubenswrapper[4970]: I1128 13:20:04.434462 4970 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 28 13:20:04 crc kubenswrapper[4970]: I1128 13:20:04.435799 4970 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:20:04 crc kubenswrapper[4970]: I1128 13:20:04.435921 4970 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:20:04 crc kubenswrapper[4970]: I1128 13:20:04.436041 4970 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:20:04 crc kubenswrapper[4970]: I1128 13:20:04.474028 4970 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-etcd/etcd-crc" Nov 28 13:20:04 crc kubenswrapper[4970]: I1128 13:20:04.496134 4970 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 28 13:20:04 crc kubenswrapper[4970]: I1128 13:20:04.496271 4970 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 28 13:20:04 crc kubenswrapper[4970]: I1128 13:20:04.497424 4970 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:20:04 crc kubenswrapper[4970]: I1128 13:20:04.497477 4970 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:20:04 crc kubenswrapper[4970]: I1128 13:20:04.497491 4970 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:20:04 crc kubenswrapper[4970]: I1128 13:20:04.497920 4970 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:20:04 crc kubenswrapper[4970]: I1128 13:20:04.497986 4970 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:20:04 crc kubenswrapper[4970]: I1128 13:20:04.498013 4970 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:20:04 crc kubenswrapper[4970]: I1128 13:20:04.510166 4970 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-etcd/etcd-crc" Nov 28 13:20:05 crc kubenswrapper[4970]: I1128 13:20:05.499024 4970 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 28 13:20:05 crc kubenswrapper[4970]: I1128 13:20:05.499081 4970 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 28 13:20:05 crc kubenswrapper[4970]: I1128 13:20:05.502819 4970 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:20:05 crc kubenswrapper[4970]: I1128 13:20:05.503089 4970 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:20:05 crc kubenswrapper[4970]: 
I1128 13:20:05.503324 4970 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:20:05 crc kubenswrapper[4970]: I1128 13:20:05.509320 4970 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:20:05 crc kubenswrapper[4970]: I1128 13:20:05.509366 4970 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:20:05 crc kubenswrapper[4970]: I1128 13:20:05.509390 4970 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:20:06 crc kubenswrapper[4970]: I1128 13:20:06.302421 4970 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 28 13:20:06 crc kubenswrapper[4970]: I1128 13:20:06.302701 4970 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 28 13:20:06 crc kubenswrapper[4970]: I1128 13:20:06.303334 4970 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" start-of-body= Nov 28 13:20:06 crc kubenswrapper[4970]: I1128 13:20:06.303488 4970 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" Nov 28 13:20:06 crc kubenswrapper[4970]: I1128 13:20:06.304515 4970 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:20:06 crc kubenswrapper[4970]: I1128 13:20:06.304560 4970 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:20:06 crc kubenswrapper[4970]: I1128 13:20:06.304575 4970 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:20:06 crc kubenswrapper[4970]: I1128 13:20:06.309809 4970 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 28 13:20:06 crc kubenswrapper[4970]: I1128 13:20:06.501994 4970 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 28 13:20:06 crc kubenswrapper[4970]: I1128 13:20:06.502727 4970 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" start-of-body= Nov 28 13:20:06 crc kubenswrapper[4970]: I1128 13:20:06.502836 4970 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" Nov 28 13:20:06 crc kubenswrapper[4970]: I1128 13:20:06.503591 4970 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 
28 13:20:06 crc kubenswrapper[4970]: I1128 13:20:06.503661 4970 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:20:06 crc kubenswrapper[4970]: I1128 13:20:06.503686 4970 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:20:06 crc kubenswrapper[4970]: I1128 13:20:06.657475 4970 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Nov 28 13:20:08 crc kubenswrapper[4970]: I1128 13:20:08.136481 4970 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Nov 28 13:20:08 crc kubenswrapper[4970]: I1128 13:20:08.169628 4970 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Nov 28 13:20:08 crc kubenswrapper[4970]: I1128 13:20:08.169702 4970 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Nov 28 13:20:08 crc kubenswrapper[4970]: I1128 13:20:08.358354 4970 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Nov 28 13:20:08 crc kubenswrapper[4970]: E1128 13:20:08.533147 4970 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": context deadline exceeded" interval="6.4s" Nov 28 13:20:08 crc kubenswrapper[4970]: I1128 13:20:08.536936 4970 reconstruct.go:205] "DevicePaths of reconstructed volumes updated" Nov 28 13:20:08 crc kubenswrapper[4970]: E1128 13:20:08.566988 4970 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes \"crc\" is forbidden: autoscaling.openshift.io/ManagedNode infra config cache not synchronized" node="crc" Nov 28 13:20:09 crc kubenswrapper[4970]: I1128 13:20:09.174838 4970 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Liveness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" start-of-body= Nov 28 13:20:09 crc kubenswrapper[4970]: I1128 13:20:09.174893 4970 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" Nov 28 13:20:09 crc kubenswrapper[4970]: E1128 13:20:09.444280 4970 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Nov 28 13:20:09 crc kubenswrapper[4970]: I1128 13:20:09.513577 4970 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Nov 28 13:20:09 crc 
kubenswrapper[4970]: I1128 13:20:09.519201 4970 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="89211b552b7e2893a562bd785f35642f3b6792b0a123133d459a728b9ca6f5f3" exitCode=255 Nov 28 13:20:09 crc kubenswrapper[4970]: I1128 13:20:09.519262 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"89211b552b7e2893a562bd785f35642f3b6792b0a123133d459a728b9ca6f5f3"} Nov 28 13:20:09 crc kubenswrapper[4970]: I1128 13:20:09.519446 4970 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 28 13:20:09 crc kubenswrapper[4970]: I1128 13:20:09.528159 4970 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:20:09 crc kubenswrapper[4970]: I1128 13:20:09.528264 4970 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:20:09 crc kubenswrapper[4970]: I1128 13:20:09.528294 4970 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:20:09 crc kubenswrapper[4970]: I1128 13:20:09.529420 4970 scope.go:117] "RemoveContainer" containerID="89211b552b7e2893a562bd785f35642f3b6792b0a123133d459a728b9ca6f5f3" Nov 28 13:20:09 crc kubenswrapper[4970]: I1128 13:20:09.694650 4970 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.328259 4970 apiserver.go:52] "Watching apiserver" Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.330755 4970 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.331685 4970 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-network-node-identity/network-node-identity-vrzqb","openshift-network-operator/iptables-alerter-4ln5h","openshift-network-operator/network-operator-58b4c7f79c-55gtf","openshift-network-console/networking-console-plugin-85b44fc459-gdk6g","openshift-network-diagnostics/network-check-source-55646444c4-trplf","openshift-network-diagnostics/network-check-target-xd92c"] Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.332292 4970 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 28 13:20:10 crc kubenswrapper[4970]: E1128 13:20:10.332388 4970 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.332417 4970 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.332292 4970 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.332527 4970 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.332984 4970 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.332992 4970 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Nov 28 13:20:10 crc kubenswrapper[4970]: E1128 13:20:10.333023 4970 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 28 13:20:10 crc kubenswrapper[4970]: E1128 13:20:10.333359 4970 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.336337 4970 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.336696 4970 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.336844 4970 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.336897 4970 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.336995 4970 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.337079 4970 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.337365 4970 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.337451 4970 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.341277 4970 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.371842 4970 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:20:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:20:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.385890 4970 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:20:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:20:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.402891 4970 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:20:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:20:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.415370 4970 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:20:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:20:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.425164 4970 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:20:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:20:10Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.428173 4970 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world" Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.433658 4970 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:20:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:20:10Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.443791 4970 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:20:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:20:10Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.445970 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.446061 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.446257 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.446298 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.446330 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.446360 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.446392 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.446423 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.446427 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" (OuterVolumeSpecName: "kube-api-access-mnrrd") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "kube-api-access-mnrrd". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.446454 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.446533 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.446561 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.446585 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") pod \"44663579-783b-4372-86d6-acf235a62d72\" (UID: \"44663579-783b-4372-86d6-acf235a62d72\") " Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.446612 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.446635 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.446659 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.446682 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 
13:20:10.446703 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.446726 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.446754 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.446777 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.446799 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.446821 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.446845 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.446867 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.446877 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "encryption-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.446891 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.446958 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.446989 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.447022 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.447098 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-session". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.447106 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.447141 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.447170 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.447201 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.447259 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.447290 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.447296 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.447323 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.447354 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.447387 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.447420 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.447457 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.447527 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.447562 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.447593 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.447623 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.447655 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") pod 
\"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.447686 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.447717 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.447747 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.447778 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.447812 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.447842 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.447875 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.447896 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" (OuterVolumeSpecName: "kube-api-access-vt5rc") pod "44663579-783b-4372-86d6-acf235a62d72" (UID: "44663579-783b-4372-86d6-acf235a62d72"). InnerVolumeSpecName "kube-api-access-vt5rc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.447906 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.447950 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.447977 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.448000 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.448024 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.448046 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.448067 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.448090 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.448111 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.448134 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tk88c\" (UniqueName: 
\"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.448187 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.448230 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.448281 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.448306 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.448331 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.448356 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.448378 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.448401 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.448422 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.448444 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.448467 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.448490 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.448515 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.448538 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.448525 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.448562 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.448585 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.448608 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.448633 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.448640 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" (OuterVolumeSpecName: "multus-daemon-config") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "multus-daemon-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.448655 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.448712 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.448748 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.448781 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.448926 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" (OuterVolumeSpecName: "machine-api-operator-tls") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "machine-api-operator-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.448813 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.449084 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.449120 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.449151 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.449183 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.449596 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" (OuterVolumeSpecName: "kube-api-access-tk88c") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "kube-api-access-tk88c". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.449612 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" (OuterVolumeSpecName: "kube-api-access-9xfj7") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "kube-api-access-9xfj7". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.449959 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" (OuterVolumeSpecName: "stats-auth") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "stats-auth". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.449974 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" (OuterVolumeSpecName: "images") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "images". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.450482 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") pod \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\" (UID: \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\") " Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.450527 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.450548 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.450580 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.450599 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.450616 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.450639 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.450656 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.450671 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.450687 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: 
\"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.450780 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.450796 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.450816 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.450832 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.450848 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.450863 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.450887 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.450905 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.450922 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.450940 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") pod 
\"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.450957 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.450974 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.450991 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.451007 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.451024 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.451041 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.451060 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.451077 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.451095 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.451118 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.451139 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.451159 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.451181 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.451201 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.451236 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.451260 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.451277 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.451295 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.451312 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.451329 4970 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.451344 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.451361 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.451378 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.451396 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.451412 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.451428 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") pod \"49ef4625-1d3a-4a9f-b595-c2433d32326d\" (UID: \"49ef4625-1d3a-4a9f-b595-c2433d32326d\") " Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.451444 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.451464 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.451501 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 
13:20:10.451516 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.451532 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.451553 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.451576 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.451601 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.451618 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.451634 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.451651 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.451667 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.451683 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 28 13:20:10 crc 
kubenswrapper[4970]: I1128 13:20:10.451698 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.451719 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.451739 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.451760 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.451783 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.451804 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.451825 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.451854 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.451880 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.451905 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") pod 
\"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.451930 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.451951 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.451967 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.451984 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.452002 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.452021 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.452043 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.452065 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.452089 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") pod \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\" (UID: \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\") " Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.452109 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kfwg7\" (UniqueName: 
\"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.452127 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.452143 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.452159 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.452174 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.452191 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.452230 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.452256 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.452279 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.452302 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.452326 4970 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.452352 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.452373 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.452396 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.452422 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.452443 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.452459 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.452477 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.452493 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.452510 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.452528 4970 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.452544 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.452561 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.452580 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.452601 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.452624 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.452646 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.452669 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.452690 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.452741 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " 
pod="openshift-network-node-identity/network-node-identity-vrzqb" Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.452771 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.452799 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.452829 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.452861 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.452882 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.452904 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.452921 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.452939 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.452961 4970 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.452978 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.452999 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.453019 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.453037 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.453094 4970 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") on node \"crc\" DevicePath \"\"" Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.453106 4970 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") on node \"crc\" DevicePath \"\"" Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.453116 4970 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.453125 4970 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") on node \"crc\" DevicePath \"\"" Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.453135 4970 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") on node \"crc\" DevicePath \"\"" Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.453144 4970 reconciler_common.go:293] "Volume detached for volume \"stats-auth\" (UniqueName: 
\"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") on node \"crc\" DevicePath \"\"" Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.453153 4970 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") on node \"crc\" DevicePath \"\"" Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.453162 4970 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.453172 4970 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") on node \"crc\" DevicePath \"\"" Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.453183 4970 reconciler_common.go:293] "Volume detached for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") on node \"crc\" DevicePath \"\"" Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.453193 4970 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") on node \"crc\" DevicePath \"\"" Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.453203 4970 reconciler_common.go:293] "Volume detached for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") on node \"crc\" DevicePath \"\"" Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.453460 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.453625 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" (OuterVolumeSpecName: "kube-api-access-2w9zh") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "kube-api-access-2w9zh". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.453692 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-serving-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.453727 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.453735 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" (OuterVolumeSpecName: "kube-api-access-bf2bz") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "kube-api-access-bf2bz". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.453774 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.454102 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" (OuterVolumeSpecName: "kube-api-access-v47cf") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "kube-api-access-v47cf". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.454282 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" (OuterVolumeSpecName: "samples-operator-tls") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "samples-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.454303 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" (OuterVolumeSpecName: "config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.454652 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.454771 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "audit-policies". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.454888 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" (OuterVolumeSpecName: "client-ca") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.455033 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" (OuterVolumeSpecName: "config") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.455034 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" (OuterVolumeSpecName: "kube-api-access-6ccd8") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "kube-api-access-6ccd8". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.454875 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.455185 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.455250 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" (OuterVolumeSpecName: "config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.455370 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" (OuterVolumeSpecName: "cert") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.455426 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" (OuterVolumeSpecName: "kube-api-access-dbsvg") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "kube-api-access-dbsvg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.455401 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" (OuterVolumeSpecName: "utilities") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.455826 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.455858 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" (OuterVolumeSpecName: "kube-api-access-fqsjt") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "kube-api-access-fqsjt". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.455981 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.456200 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.456474 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" (OuterVolumeSpecName: "kube-api-access-jkwtn") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "kube-api-access-jkwtn". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.456523 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" (OuterVolumeSpecName: "certs") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.456663 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "srv-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.456655 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" (OuterVolumeSpecName: "cni-sysctl-allowlist") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-sysctl-allowlist". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.457390 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.457458 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" (OuterVolumeSpecName: "kube-api-access-d6qdx") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "kube-api-access-d6qdx". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.457478 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" (OuterVolumeSpecName: "image-import-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "image-import-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.457485 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" (OuterVolumeSpecName: "config") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.457481 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" (OuterVolumeSpecName: "config") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.457671 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.457731 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" (OuterVolumeSpecName: "kube-api-access-zgdk5") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "kube-api-access-zgdk5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.458103 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" (OuterVolumeSpecName: "kube-api-access-w7l8j") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "kube-api-access-w7l8j". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.458502 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" (OuterVolumeSpecName: "config") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.458582 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" (OuterVolumeSpecName: "kube-api-access-qs4fp") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "kube-api-access-qs4fp". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.458639 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" (OuterVolumeSpecName: "kube-api-access-qg5z5") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "kube-api-access-qg5z5". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.458673 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.458684 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" (OuterVolumeSpecName: "signing-cabundle") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-cabundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.458746 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" (OuterVolumeSpecName: "webhook-certs") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "webhook-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.458758 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.458831 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.459048 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.459603 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.459646 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.459713 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" (OuterVolumeSpecName: "kube-api-access-x4zgh") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "kube-api-access-x4zgh". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.459738 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" (OuterVolumeSpecName: "utilities") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.459958 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). 
InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.459983 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" (OuterVolumeSpecName: "kube-api-access-rnphk") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "kube-api-access-rnphk". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.459994 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.459964 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.460015 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.460195 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.460196 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.460518 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.460566 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.460786 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" (OuterVolumeSpecName: "kube-api-access-sb6h7") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "kube-api-access-sb6h7". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.460955 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" (OuterVolumeSpecName: "kube-api-access-jhbk2") pod "bd23aa5c-e532-4e53-bccf-e79f130c5ae8" (UID: "bd23aa5c-e532-4e53-bccf-e79f130c5ae8"). InnerVolumeSpecName "kube-api-access-jhbk2". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.460978 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" (OuterVolumeSpecName: "kube-api-access-lz9wn") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "kube-api-access-lz9wn". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.461143 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" (OuterVolumeSpecName: "config") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.461356 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.461395 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovnkube-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.461421 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.461562 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.461608 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" (OuterVolumeSpecName: "image-registry-operator-tls") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "image-registry-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.462194 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" (OuterVolumeSpecName: "default-certificate") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "default-certificate". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.462252 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" (OuterVolumeSpecName: "config") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.462260 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "apiservice-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.462280 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.462311 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" (OuterVolumeSpecName: "config") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 28 13:20:10 crc kubenswrapper[4970]: E1128 13:20:10.462505 4970 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-28 13:20:10.962479303 +0000 UTC m=+21.815361263 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.462515 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" (OuterVolumeSpecName: "kube-api-access-4d4hj") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "kube-api-access-4d4hj". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.462748 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" (OuterVolumeSpecName: "mcd-auth-proxy-config") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "mcd-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.462924 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.463313 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" (OuterVolumeSpecName: "config") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.463407 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" (OuterVolumeSpecName: "kube-api-access-htfz6") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "kube-api-access-htfz6". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.463414 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "service-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.463617 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.463668 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" (OuterVolumeSpecName: "kube-api-access-pj782") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "kube-api-access-pj782". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.463862 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.463886 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" (OuterVolumeSpecName: "kube-api-access-pcxfs") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "kube-api-access-pcxfs". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.463934 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" (OuterVolumeSpecName: "kube-api-access-cfbct") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "kube-api-access-cfbct". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.464053 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" (OuterVolumeSpecName: "kube-api-access-w9rds") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "kube-api-access-w9rds". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.464185 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.464282 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" (OuterVolumeSpecName: "kube-api-access-lzf88") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "kube-api-access-lzf88". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.462293 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" (OuterVolumeSpecName: "signing-key") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.464379 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.464505 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.464555 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.465111 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" (OuterVolumeSpecName: "kube-api-access-mg5zb") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "kube-api-access-mg5zb". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.465140 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.465148 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-binary-copy". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.465434 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.465564 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.465502 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.465590 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.465707 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.466060 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" (OuterVolumeSpecName: "kube-api-access-wxkg8") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "kube-api-access-wxkg8". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.466355 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" (OuterVolumeSpecName: "config-volume") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.466370 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" (OuterVolumeSpecName: "machine-approver-tls") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). 
InnerVolumeSpecName "machine-approver-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.465948 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.466442 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" (OuterVolumeSpecName: "images") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.466382 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.464968 4970 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:20:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:20:10Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.465476 4970 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Nov 28 13:20:10 crc kubenswrapper[4970]: E1128 13:20:10.466931 4970 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.466966 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" (OuterVolumeSpecName: "etcd-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.466987 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 28 13:20:10 crc kubenswrapper[4970]: E1128 13:20:10.467024 4970 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-11-28 13:20:10.966999708 +0000 UTC m=+21.819881698 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Nov 28 13:20:10 crc kubenswrapper[4970]: E1128 13:20:10.467519 4970 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Nov 28 13:20:10 crc kubenswrapper[4970]: E1128 13:20:10.467580 4970 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-11-28 13:20:10.967564316 +0000 UTC m=+21.820446296 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.466086 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.469274 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" (OuterVolumeSpecName: "config") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.469326 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" (OuterVolumeSpecName: "kube-api-access-w4xd4") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "kube-api-access-w4xd4". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.469622 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.469745 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" (OuterVolumeSpecName: "etcd-service-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.469759 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" (OuterVolumeSpecName: "client-ca") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.470100 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" (OuterVolumeSpecName: "kube-api-access-ngvvp") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "kube-api-access-ngvvp". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.470479 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.470546 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" (OuterVolumeSpecName: "kube-api-access-x7zkh") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "kube-api-access-x7zkh". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.466288 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.470651 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" (OuterVolumeSpecName: "kube-api-access-6g6sz") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "kube-api-access-6g6sz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.470767 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" (OuterVolumeSpecName: "kube-api-access-gf66m") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "kube-api-access-gf66m". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.489525 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.489560 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" (OuterVolumeSpecName: "utilities") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 28 13:20:10 crc kubenswrapper[4970]: E1128 13:20:10.489731 4970 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.489743 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 28 13:20:10 crc kubenswrapper[4970]: E1128 13:20:10.489752 4970 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Nov 28 13:20:10 crc kubenswrapper[4970]: E1128 13:20:10.489772 4970 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 28 13:20:10 crc kubenswrapper[4970]: E1128 13:20:10.489845 4970 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-11-28 13:20:10.989824241 +0000 UTC m=+21.842706061 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.489866 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.489907 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" (OuterVolumeSpecName: "kube-api-access-kfwg7") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "kube-api-access-kfwg7". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.489948 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.490194 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" (OuterVolumeSpecName: "serviceca") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "serviceca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.490443 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.490482 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.490795 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.490914 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.491010 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" (OuterVolumeSpecName: "tmpfs") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "tmpfs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.491851 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.492587 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 28 13:20:10 crc kubenswrapper[4970]: E1128 13:20:10.493234 4970 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Nov 28 13:20:10 crc kubenswrapper[4970]: E1128 13:20:10.494465 4970 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Nov 28 13:20:10 crc kubenswrapper[4970]: E1128 13:20:10.494561 4970 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 28 13:20:10 crc kubenswrapper[4970]: E1128 13:20:10.494672 4970 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-11-28 13:20:10.994657807 +0000 UTC m=+21.847539687 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.494471 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.494726 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" (OuterVolumeSpecName: "audit") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "audit". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.494584 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" (OuterVolumeSpecName: "config") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.493571 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.493674 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.494123 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" (OuterVolumeSpecName: "kube-api-access-xcgwh") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "kube-api-access-xcgwh". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.494163 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.494180 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" (OuterVolumeSpecName: "kube-api-access-2d4wz") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "kube-api-access-2d4wz". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.494313 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" (OuterVolumeSpecName: "kube-api-access-d4lsv") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "kube-api-access-d4lsv". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.494468 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.493583 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.495016 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" (OuterVolumeSpecName: "available-featuregates") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "available-featuregates". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.496882 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.496981 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.498880 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" (OuterVolumeSpecName: "config") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.498900 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.502454 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" (OuterVolumeSpecName: "kube-api-access-s4n52") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "kube-api-access-s4n52". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.502586 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" (OuterVolumeSpecName: "ovn-control-plane-metrics-cert") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovn-control-plane-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.502559 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" (OuterVolumeSpecName: "kube-api-access-279lb") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "kube-api-access-279lb". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.502990 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" (OuterVolumeSpecName: "kube-api-access-8tdtz") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "kube-api-access-8tdtz". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.503023 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.503252 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.503357 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" (OuterVolumeSpecName: "kube-api-access-x2m85") pod "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" (UID: "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d"). 
InnerVolumeSpecName "kube-api-access-x2m85". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.503559 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.503586 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" (OuterVolumeSpecName: "kube-api-access-nzwt7") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "kube-api-access-nzwt7". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.503717 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" (OuterVolumeSpecName: "kube-api-access-xcphl") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "kube-api-access-xcphl". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.503978 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.504452 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.504469 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" (OuterVolumeSpecName: "mcc-auth-proxy-config") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "mcc-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.504768 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" (OuterVolumeSpecName: "node-bootstrap-token") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "node-bootstrap-token". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.504801 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.506489 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" (OuterVolumeSpecName: "kube-api-access-zkvpv") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "kube-api-access-zkvpv". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.507009 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.507104 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" (OuterVolumeSpecName: "package-server-manager-serving-cert") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "package-server-manager-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.507403 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" (OuterVolumeSpecName: "kube-api-access-pjr6v") pod "49ef4625-1d3a-4a9f-b595-c2433d32326d" (UID: "49ef4625-1d3a-4a9f-b595-c2433d32326d"). InnerVolumeSpecName "kube-api-access-pjr6v". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.507467 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.507477 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" (OuterVolumeSpecName: "service-ca") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.507524 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" (OuterVolumeSpecName: "kube-api-access-7c4vf") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). 
InnerVolumeSpecName "kube-api-access-7c4vf". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.507575 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.507628 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" (OuterVolumeSpecName: "kube-api-access-249nr") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "kube-api-access-249nr". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.507648 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.507665 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.508803 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.510354 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" (OuterVolumeSpecName: "control-plane-machine-set-operator-tls") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "control-plane-machine-set-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.510453 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.510504 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.510847 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "webhook-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.510836 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" (OuterVolumeSpecName: "utilities") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.511061 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" (OuterVolumeSpecName: "console-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.511084 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" (OuterVolumeSpecName: "service-ca") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.511118 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" (OuterVolumeSpecName: "config") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.511164 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.511242 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" (OuterVolumeSpecName: "kube-api-access-fcqwp") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "kube-api-access-fcqwp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.511371 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" (OuterVolumeSpecName: "config") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.517109 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.522364 4970 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.523974 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"478a53f5d28b31054e796d95569a25b33787f3d2562ca562603b609f530e95d0"} Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.525116 4970 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.527700 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.535553 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.535895 4970 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.536747 4970 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:20:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:20:10Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.537788 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "ca-trust-extracted". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.548730 4970 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:20:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:20:10Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.554466 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.554505 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.554573 4970 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.554591 4970 reconciler_common.go:293] "Volume detached for volume 
\"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") on node \"crc\" DevicePath \"\"" Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.554606 4970 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") on node \"crc\" DevicePath \"\"" Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.554619 4970 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") on node \"crc\" DevicePath \"\"" Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.554633 4970 reconciler_common.go:293] "Volume detached for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") on node \"crc\" DevicePath \"\"" Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.554648 4970 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") on node \"crc\" DevicePath \"\"" Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.554662 4970 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.554675 4970 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") on node \"crc\" DevicePath \"\"" Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.554689 4970 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.554702 4970 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") on node \"crc\" DevicePath \"\"" Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.554715 4970 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") on node \"crc\" DevicePath \"\"" Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.554728 4970 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") on node \"crc\" DevicePath \"\"" Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.554741 4970 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") on node \"crc\" DevicePath \"\"" Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.554755 4970 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") on node \"crc\" DevicePath \"\"" Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.554768 4970 
reconciler_common.go:293] "Volume detached for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") on node \"crc\" DevicePath \"\"" Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.554780 4970 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") on node \"crc\" DevicePath \"\"" Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.554794 4970 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") on node \"crc\" DevicePath \"\"" Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.554809 4970 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") on node \"crc\" DevicePath \"\"" Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.554821 4970 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") on node \"crc\" DevicePath \"\"" Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.554835 4970 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") on node \"crc\" DevicePath \"\"" Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.554848 4970 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.554861 4970 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.554874 4970 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.554889 4970 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") on node \"crc\" DevicePath \"\"" Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.554902 4970 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.554915 4970 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.554927 4970 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") on node \"crc\" DevicePath \"\"" Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.554940 4970 reconciler_common.go:293] 
"Volume detached for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") on node \"crc\" DevicePath \"\"" Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.554953 4970 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.554967 4970 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") on node \"crc\" DevicePath \"\"" Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.554983 4970 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") on node \"crc\" DevicePath \"\"" Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.554995 4970 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.555008 4970 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") on node \"crc\" DevicePath \"\"" Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.555021 4970 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") on node \"crc\" DevicePath \"\"" Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.555034 4970 reconciler_common.go:293] "Volume detached for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.555047 4970 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") on node \"crc\" DevicePath \"\"" Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.555062 4970 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") on node \"crc\" DevicePath \"\"" Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.555076 4970 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.555089 4970 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") on node \"crc\" DevicePath \"\"" Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.555102 4970 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") on node \"crc\" DevicePath \"\"" Nov 28 13:20:10 crc 
kubenswrapper[4970]: I1128 13:20:10.555115 4970 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.555127 4970 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.555140 4970 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.555167 4970 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") on node \"crc\" DevicePath \"\"" Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.555180 4970 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.555236 4970 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.555249 4970 reconciler_common.go:293] "Volume detached for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") on node \"crc\" DevicePath \"\"" Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.555263 4970 reconciler_common.go:293] "Volume detached for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") on node \"crc\" DevicePath \"\"" Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.555277 4970 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") on node \"crc\" DevicePath \"\"" Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.555292 4970 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.555305 4970 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") on node \"crc\" DevicePath \"\"" Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.555318 4970 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") on node \"crc\" DevicePath \"\"" Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.555333 4970 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") on node \"crc\" DevicePath \"\"" Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 
13:20:10.555347 4970 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") on node \"crc\" DevicePath \"\"" Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.555362 4970 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") on node \"crc\" DevicePath \"\"" Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.555376 4970 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") on node \"crc\" DevicePath \"\"" Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.555391 4970 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.555408 4970 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") on node \"crc\" DevicePath \"\"" Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.555425 4970 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.555439 4970 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") on node \"crc\" DevicePath \"\"" Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.555452 4970 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") on node \"crc\" DevicePath \"\"" Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.555466 4970 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.555479 4970 reconciler_common.go:293] "Volume detached for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.555494 4970 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") on node \"crc\" DevicePath \"\"" Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.555507 4970 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.555519 4970 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") on node \"crc\" DevicePath \"\"" Nov 28 13:20:10 crc 
kubenswrapper[4970]: I1128 13:20:10.555532 4970 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") on node \"crc\" DevicePath \"\"" Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.555545 4970 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") on node \"crc\" DevicePath \"\"" Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.555559 4970 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") on node \"crc\" DevicePath \"\"" Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.555572 4970 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") on node \"crc\" DevicePath \"\"" Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.555587 4970 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.555602 4970 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") on node \"crc\" DevicePath \"\"" Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.555616 4970 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.555631 4970 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") on node \"crc\" DevicePath \"\"" Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.555644 4970 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") on node \"crc\" DevicePath \"\"" Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.555657 4970 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") on node \"crc\" DevicePath \"\"" Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.555669 4970 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") on node \"crc\" DevicePath \"\"" Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.555682 4970 reconciler_common.go:293] "Volume detached for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") on node \"crc\" DevicePath \"\"" Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.555695 4970 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Nov 28 13:20:10 crc 
kubenswrapper[4970]: I1128 13:20:10.555709 4970 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.555722 4970 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") on node \"crc\" DevicePath \"\"" Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.555736 4970 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.555750 4970 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") on node \"crc\" DevicePath \"\"" Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.555764 4970 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") on node \"crc\" DevicePath \"\"" Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.555776 4970 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") on node \"crc\" DevicePath \"\"" Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.555789 4970 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") on node \"crc\" DevicePath \"\"" Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.555804 4970 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.555817 4970 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") on node \"crc\" DevicePath \"\"" Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.555831 4970 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.555845 4970 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.555858 4970 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") on node \"crc\" DevicePath \"\"" Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.555871 4970 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") on node 
\"crc\" DevicePath \"\"" Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.555884 4970 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") on node \"crc\" DevicePath \"\"" Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.555897 4970 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.555909 4970 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.555922 4970 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.555936 4970 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") on node \"crc\" DevicePath \"\"" Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.555949 4970 reconciler_common.go:293] "Volume detached for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") on node \"crc\" DevicePath \"\"" Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.555962 4970 reconciler_common.go:293] "Volume detached for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") on node \"crc\" DevicePath \"\"" Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.555975 4970 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") on node \"crc\" DevicePath \"\"" Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.555987 4970 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.556000 4970 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") on node \"crc\" DevicePath \"\"" Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.556013 4970 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") on node \"crc\" DevicePath \"\"" Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.556025 4970 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") on node \"crc\" DevicePath \"\"" Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.556037 4970 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") on node \"crc\" DevicePath \"\"" Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 
13:20:10.556050 4970 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") on node \"crc\" DevicePath \"\"" Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.556062 4970 reconciler_common.go:293] "Volume detached for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") on node \"crc\" DevicePath \"\"" Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.556075 4970 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") on node \"crc\" DevicePath \"\"" Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.556087 4970 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.556100 4970 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") on node \"crc\" DevicePath \"\"" Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.556113 4970 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.556126 4970 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") on node \"crc\" DevicePath \"\"" Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.556138 4970 reconciler_common.go:293] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") on node \"crc\" DevicePath \"\"" Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.556151 4970 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.556163 4970 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") on node \"crc\" DevicePath \"\"" Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.556176 4970 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") on node \"crc\" DevicePath \"\"" Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.556189 4970 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") on node \"crc\" DevicePath \"\"" Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.556202 4970 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.556236 4970 
reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") on node \"crc\" DevicePath \"\"" Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.556248 4970 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") on node \"crc\" DevicePath \"\"" Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.556261 4970 reconciler_common.go:293] "Volume detached for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") on node \"crc\" DevicePath \"\"" Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.556273 4970 reconciler_common.go:293] "Volume detached for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") on node \"crc\" DevicePath \"\"" Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.556287 4970 reconciler_common.go:293] "Volume detached for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") on node \"crc\" DevicePath \"\"" Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.556300 4970 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") on node \"crc\" DevicePath \"\"" Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.556312 4970 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.556321 4970 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") on node \"crc\" DevicePath \"\"" Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.556329 4970 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.556339 4970 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.556348 4970 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") on node \"crc\" DevicePath \"\"" Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.556357 4970 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") on node \"crc\" DevicePath \"\"" Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.556366 4970 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.556374 4970 
reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") on node \"crc\" DevicePath \"\"" Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.556382 4970 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") on node \"crc\" DevicePath \"\"" Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.556403 4970 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") on node \"crc\" DevicePath \"\"" Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.556412 4970 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") on node \"crc\" DevicePath \"\"" Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.556421 4970 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") on node \"crc\" DevicePath \"\"" Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.556429 4970 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") on node \"crc\" DevicePath \"\"" Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.556437 4970 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") on node \"crc\" DevicePath \"\"" Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.556446 4970 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.556461 4970 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") on node \"crc\" DevicePath \"\"" Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.556469 4970 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") on node \"crc\" DevicePath \"\"" Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.556478 4970 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") on node \"crc\" DevicePath \"\"" Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.556486 4970 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") on node \"crc\" DevicePath \"\"" Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.556497 4970 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.556505 4970 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: 
\"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.556515 4970 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") on node \"crc\" DevicePath \"\"" Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.556525 4970 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") on node \"crc\" DevicePath \"\"" Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.556533 4970 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.556542 4970 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") on node \"crc\" DevicePath \"\"" Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.556552 4970 reconciler_common.go:293] "Volume detached for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") on node \"crc\" DevicePath \"\"" Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.556561 4970 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") on node \"crc\" DevicePath \"\"" Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.556584 4970 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") on node \"crc\" DevicePath \"\"" Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.556593 4970 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") on node \"crc\" DevicePath \"\"" Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.556602 4970 reconciler_common.go:293] "Volume detached for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") on node \"crc\" DevicePath \"\"" Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.556616 4970 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") on node \"crc\" DevicePath \"\"" Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.556625 4970 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") on node \"crc\" DevicePath \"\"" Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.556633 4970 reconciler_common.go:293] "Volume detached for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") on node \"crc\" DevicePath \"\"" Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.556642 4970 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: 
\"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.556651 4970 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") on node \"crc\" DevicePath \"\"" Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.556660 4970 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") on node \"crc\" DevicePath \"\"" Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.556670 4970 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") on node \"crc\" DevicePath \"\"" Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.556678 4970 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.556687 4970 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") on node \"crc\" DevicePath \"\"" Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.556696 4970 reconciler_common.go:293] "Volume detached for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") on node \"crc\" DevicePath \"\"" Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.556705 4970 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") on node \"crc\" DevicePath \"\"" Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.556713 4970 reconciler_common.go:293] "Volume detached for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") on node \"crc\" DevicePath \"\"" Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.556721 4970 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") on node \"crc\" DevicePath \"\"" Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.556731 4970 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") on node \"crc\" DevicePath \"\"" Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.556739 4970 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") on node \"crc\" DevicePath \"\"" Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.556748 4970 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") on node \"crc\" DevicePath \"\"" Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.556757 4970 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") on node 
\"crc\" DevicePath \"\"" Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.556766 4970 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") on node \"crc\" DevicePath \"\"" Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.556775 4970 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") on node \"crc\" DevicePath \"\"" Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.556784 4970 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.556792 4970 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") on node \"crc\" DevicePath \"\"" Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.556801 4970 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.556809 4970 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") on node \"crc\" DevicePath \"\"" Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.556818 4970 reconciler_common.go:293] "Volume detached for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") on node \"crc\" DevicePath \"\"" Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.556827 4970 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.556836 4970 reconciler_common.go:293] "Volume detached for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.556846 4970 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") on node \"crc\" DevicePath \"\"" Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.556855 4970 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.556898 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.556950 4970 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.562596 4970 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:20:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:20:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.572618 4970 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:20:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:20:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.582797 4970 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:20:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:20:10Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.591833 4970 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:20:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:20:10Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.654720 4970 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.669069 4970 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.679052 4970 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Nov 28 13:20:10 crc kubenswrapper[4970]: I1128 13:20:10.965633 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 28 13:20:10 crc kubenswrapper[4970]: E1128 13:20:10.966031 4970 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-28 13:20:11.965999094 +0000 UTC m=+22.818880934 (durationBeforeRetry 1s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 28 13:20:11 crc kubenswrapper[4970]: I1128 13:20:11.067302 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 28 13:20:11 crc kubenswrapper[4970]: I1128 13:20:11.067371 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 28 13:20:11 crc kubenswrapper[4970]: I1128 13:20:11.067419 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 28 13:20:11 crc kubenswrapper[4970]: I1128 13:20:11.067455 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 28 13:20:11 crc kubenswrapper[4970]: E1128 13:20:11.067584 4970 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Nov 28 13:20:11 crc kubenswrapper[4970]: E1128 13:20:11.067654 4970 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-11-28 13:20:12.067633619 +0000 UTC m=+22.920515449 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Nov 28 13:20:11 crc kubenswrapper[4970]: E1128 13:20:11.068198 4970 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Nov 28 13:20:11 crc kubenswrapper[4970]: E1128 13:20:11.068275 4970 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Nov 28 13:20:11 crc kubenswrapper[4970]: E1128 13:20:11.068303 4970 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 28 13:20:11 crc kubenswrapper[4970]: E1128 13:20:11.068369 4970 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-11-28 13:20:12.068345121 +0000 UTC m=+22.921226971 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 28 13:20:11 crc kubenswrapper[4970]: E1128 13:20:11.068451 4970 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Nov 28 13:20:11 crc kubenswrapper[4970]: E1128 13:20:11.068470 4970 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Nov 28 13:20:11 crc kubenswrapper[4970]: E1128 13:20:11.068485 4970 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 28 13:20:11 crc kubenswrapper[4970]: E1128 13:20:11.068524 4970 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-11-28 13:20:12.068511747 +0000 UTC m=+22.921393587 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 28 13:20:11 crc kubenswrapper[4970]: E1128 13:20:11.068570 4970 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Nov 28 13:20:11 crc kubenswrapper[4970]: E1128 13:20:11.068607 4970 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-11-28 13:20:12.06859629 +0000 UTC m=+22.921478130 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Nov 28 13:20:11 crc kubenswrapper[4970]: I1128 13:20:11.386509 4970 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01ab3dd5-8196-46d0-ad33-122e2ca51def" path="/var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes" Nov 28 13:20:11 crc kubenswrapper[4970]: I1128 13:20:11.387614 4970 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" path="/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes" Nov 28 13:20:11 crc kubenswrapper[4970]: I1128 13:20:11.389957 4970 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09efc573-dbb6-4249-bd59-9b87aba8dd28" path="/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes" Nov 28 13:20:11 crc kubenswrapper[4970]: I1128 13:20:11.391267 4970 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b574797-001e-440a-8f4e-c0be86edad0f" path="/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes" Nov 28 13:20:11 crc kubenswrapper[4970]: I1128 13:20:11.393286 4970 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b78653f-4ff9-4508-8672-245ed9b561e3" path="/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes" Nov 28 13:20:11 crc kubenswrapper[4970]: I1128 13:20:11.394289 4970 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1386a44e-36a2-460c-96d0-0359d2b6f0f5" path="/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes" Nov 28 13:20:11 crc kubenswrapper[4970]: I1128 13:20:11.395438 4970 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1bf7eb37-55a3-4c65-b768-a94c82151e69" path="/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes" Nov 28 13:20:11 crc kubenswrapper[4970]: I1128 13:20:11.397567 4970 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d611f23-29be-4491-8495-bee1670e935f" path="/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes" Nov 28 13:20:11 crc kubenswrapper[4970]: I1128 13:20:11.398876 4970 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20b0d48f-5fd6-431c-a545-e3c800c7b866" 
path="/var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/volumes" Nov 28 13:20:11 crc kubenswrapper[4970]: I1128 13:20:11.400800 4970 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" path="/var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes" Nov 28 13:20:11 crc kubenswrapper[4970]: I1128 13:20:11.402018 4970 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22c825df-677d-4ca6-82db-3454ed06e783" path="/var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes" Nov 28 13:20:11 crc kubenswrapper[4970]: I1128 13:20:11.405888 4970 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25e176fe-21b4-4974-b1ed-c8b94f112a7f" path="/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes" Nov 28 13:20:11 crc kubenswrapper[4970]: I1128 13:20:11.406891 4970 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" path="/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes" Nov 28 13:20:11 crc kubenswrapper[4970]: I1128 13:20:11.407930 4970 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31d8b7a1-420e-4252-a5b7-eebe8a111292" path="/var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes" Nov 28 13:20:11 crc kubenswrapper[4970]: I1128 13:20:11.409803 4970 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ab1a177-2de0-46d9-b765-d0d0649bb42e" path="/var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/volumes" Nov 28 13:20:11 crc kubenswrapper[4970]: I1128 13:20:11.410881 4970 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" path="/var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes" Nov 28 13:20:11 crc kubenswrapper[4970]: I1128 13:20:11.412800 4970 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43509403-f426-496e-be36-56cef71462f5" path="/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes" Nov 28 13:20:11 crc kubenswrapper[4970]: I1128 13:20:11.413570 4970 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44663579-783b-4372-86d6-acf235a62d72" path="/var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/volumes" Nov 28 13:20:11 crc kubenswrapper[4970]: I1128 13:20:11.414726 4970 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="496e6271-fb68-4057-954e-a0d97a4afa3f" path="/var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes" Nov 28 13:20:11 crc kubenswrapper[4970]: I1128 13:20:11.416672 4970 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" path="/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes" Nov 28 13:20:11 crc kubenswrapper[4970]: I1128 13:20:11.417597 4970 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49ef4625-1d3a-4a9f-b595-c2433d32326d" path="/var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/volumes" Nov 28 13:20:11 crc kubenswrapper[4970]: I1128 13:20:11.419954 4970 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4bb40260-dbaa-4fb0-84df-5e680505d512" path="/var/lib/kubelet/pods/4bb40260-dbaa-4fb0-84df-5e680505d512/volumes" Nov 28 13:20:11 crc kubenswrapper[4970]: I1128 13:20:11.420974 4970 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5225d0e4-402f-4861-b410-819f433b1803" 
path="/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes" Nov 28 13:20:11 crc kubenswrapper[4970]: I1128 13:20:11.423643 4970 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5441d097-087c-4d9a-baa8-b210afa90fc9" path="/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes" Nov 28 13:20:11 crc kubenswrapper[4970]: I1128 13:20:11.424559 4970 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57a731c4-ef35-47a8-b875-bfb08a7f8011" path="/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes" Nov 28 13:20:11 crc kubenswrapper[4970]: I1128 13:20:11.425736 4970 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b88f790-22fa-440e-b583-365168c0b23d" path="/var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/volumes" Nov 28 13:20:11 crc kubenswrapper[4970]: I1128 13:20:11.427178 4970 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5fe579f8-e8a6-4643-bce5-a661393c4dde" path="/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/volumes" Nov 28 13:20:11 crc kubenswrapper[4970]: I1128 13:20:11.427641 4970 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6402fda4-df10-493c-b4e5-d0569419652d" path="/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes" Nov 28 13:20:11 crc kubenswrapper[4970]: I1128 13:20:11.428824 4970 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6509e943-70c6-444c-bc41-48a544e36fbd" path="/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes" Nov 28 13:20:11 crc kubenswrapper[4970]: I1128 13:20:11.429489 4970 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6731426b-95fe-49ff-bb5f-40441049fde2" path="/var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/volumes" Nov 28 13:20:11 crc kubenswrapper[4970]: I1128 13:20:11.430407 4970 kubelet_volumes.go:152] "Cleaned up orphaned volume subpath from pod" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volume-subpaths/run-systemd/ovnkube-controller/6" Nov 28 13:20:11 crc kubenswrapper[4970]: I1128 13:20:11.430503 4970 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volumes" Nov 28 13:20:11 crc kubenswrapper[4970]: I1128 13:20:11.432181 4970 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7539238d-5fe0-46ed-884e-1c3b566537ec" path="/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes" Nov 28 13:20:11 crc kubenswrapper[4970]: I1128 13:20:11.433168 4970 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7583ce53-e0fe-4a16-9e4d-50516596a136" path="/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes" Nov 28 13:20:11 crc kubenswrapper[4970]: I1128 13:20:11.433726 4970 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7bb08738-c794-4ee8-9972-3a62ca171029" path="/var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes" Nov 28 13:20:11 crc kubenswrapper[4970]: I1128 13:20:11.435286 4970 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87cf06ed-a83f-41a7-828d-70653580a8cb" path="/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes" Nov 28 13:20:11 crc kubenswrapper[4970]: I1128 13:20:11.436110 4970 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" 
path="/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes" Nov 28 13:20:11 crc kubenswrapper[4970]: I1128 13:20:11.439374 4970 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="925f1c65-6136-48ba-85aa-3a3b50560753" path="/var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes" Nov 28 13:20:11 crc kubenswrapper[4970]: I1128 13:20:11.440670 4970 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" path="/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/volumes" Nov 28 13:20:11 crc kubenswrapper[4970]: I1128 13:20:11.441998 4970 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d4552c7-cd75-42dd-8880-30dd377c49a4" path="/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes" Nov 28 13:20:11 crc kubenswrapper[4970]: I1128 13:20:11.442678 4970 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" path="/var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/volumes" Nov 28 13:20:11 crc kubenswrapper[4970]: I1128 13:20:11.443932 4970 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a31745f5-9847-4afe-82a5-3161cc66ca93" path="/var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes" Nov 28 13:20:11 crc kubenswrapper[4970]: I1128 13:20:11.444858 4970 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" path="/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes" Nov 28 13:20:11 crc kubenswrapper[4970]: I1128 13:20:11.446041 4970 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6312bbd-5731-4ea0-a20f-81d5a57df44a" path="/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/volumes" Nov 28 13:20:11 crc kubenswrapper[4970]: I1128 13:20:11.446600 4970 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" path="/var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes" Nov 28 13:20:11 crc kubenswrapper[4970]: I1128 13:20:11.447534 4970 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" path="/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes" Nov 28 13:20:11 crc kubenswrapper[4970]: I1128 13:20:11.448153 4970 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" path="/var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/volumes" Nov 28 13:20:11 crc kubenswrapper[4970]: I1128 13:20:11.449372 4970 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf126b07-da06-4140-9a57-dfd54fc6b486" path="/var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes" Nov 28 13:20:11 crc kubenswrapper[4970]: I1128 13:20:11.449887 4970 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c03ee662-fb2f-4fc4-a2c1-af487c19d254" path="/var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes" Nov 28 13:20:11 crc kubenswrapper[4970]: I1128 13:20:11.450819 4970 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" path="/var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/volumes" Nov 28 13:20:11 crc kubenswrapper[4970]: I1128 13:20:11.451351 4970 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7e6199b-1264-4501-8953-767f51328d08" 
path="/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes" Nov 28 13:20:11 crc kubenswrapper[4970]: I1128 13:20:11.451963 4970 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="efdd0498-1daa-4136-9a4a-3b948c2293fc" path="/var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/volumes" Nov 28 13:20:11 crc kubenswrapper[4970]: I1128 13:20:11.453025 4970 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" path="/var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/volumes" Nov 28 13:20:11 crc kubenswrapper[4970]: I1128 13:20:11.453757 4970 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fda69060-fa79-4696-b1a6-7980f124bf7c" path="/var/lib/kubelet/pods/fda69060-fa79-4696-b1a6-7980f124bf7c/volumes" Nov 28 13:20:11 crc kubenswrapper[4970]: I1128 13:20:11.528061 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"a89c54ce3dd6a2a3abb5124cfbc03c0f39feb636ac5859d1e5eebde10120f87f"} Nov 28 13:20:11 crc kubenswrapper[4970]: I1128 13:20:11.529955 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"1799f7e3e8d1006f9c689776dc0209a0a45e3a040c5aeed29ef719208f0320be"} Nov 28 13:20:11 crc kubenswrapper[4970]: I1128 13:20:11.529986 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"51b9354fa07cc095e7ac85b75a24de8813ecf768a607a548ee61c443344753ba"} Nov 28 13:20:11 crc kubenswrapper[4970]: I1128 13:20:11.531840 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"007492aa3769f7765a343b0ff78727224d27d55e790f5db415ad23f286f8d268"} Nov 28 13:20:11 crc kubenswrapper[4970]: I1128 13:20:11.531918 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"4cc270f12503d6b89e82610e4e4ae1f481491b9119333df4334120840864a827"} Nov 28 13:20:11 crc kubenswrapper[4970]: E1128 13:20:11.542858 4970 kubelet.go:1929] "Failed creating a mirror pod for" err="pods \"kube-apiserver-crc\" already exists" pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 28 13:20:11 crc kubenswrapper[4970]: I1128 13:20:11.975444 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 28 13:20:11 crc kubenswrapper[4970]: E1128 13:20:11.975687 4970 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-28 13:20:13.975655998 +0000 UTC m=+24.828537828 (durationBeforeRetry 2s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 28 13:20:12 crc kubenswrapper[4970]: I1128 13:20:12.076617 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 28 13:20:12 crc kubenswrapper[4970]: I1128 13:20:12.076675 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 28 13:20:12 crc kubenswrapper[4970]: I1128 13:20:12.076701 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 28 13:20:12 crc kubenswrapper[4970]: I1128 13:20:12.076726 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 28 13:20:12 crc kubenswrapper[4970]: E1128 13:20:12.076805 4970 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Nov 28 13:20:12 crc kubenswrapper[4970]: E1128 13:20:12.076835 4970 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Nov 28 13:20:12 crc kubenswrapper[4970]: E1128 13:20:12.076852 4970 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Nov 28 13:20:12 crc kubenswrapper[4970]: E1128 13:20:12.076938 4970 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Nov 28 13:20:12 crc kubenswrapper[4970]: E1128 13:20:12.076968 4970 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Nov 28 13:20:12 crc kubenswrapper[4970]: E1128 13:20:12.076976 4970 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: 
object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Nov 28 13:20:12 crc kubenswrapper[4970]: E1128 13:20:12.076990 4970 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 28 13:20:12 crc kubenswrapper[4970]: E1128 13:20:12.076989 4970 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 28 13:20:12 crc kubenswrapper[4970]: E1128 13:20:12.076867 4970 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-11-28 13:20:14.07685114 +0000 UTC m=+24.929732940 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Nov 28 13:20:12 crc kubenswrapper[4970]: E1128 13:20:12.077100 4970 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-11-28 13:20:14.077078858 +0000 UTC m=+24.929960668 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Nov 28 13:20:12 crc kubenswrapper[4970]: E1128 13:20:12.077118 4970 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-11-28 13:20:14.077110549 +0000 UTC m=+24.929992369 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 28 13:20:12 crc kubenswrapper[4970]: E1128 13:20:12.077133 4970 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-11-28 13:20:14.077126099 +0000 UTC m=+24.930007919 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 28 13:20:12 crc kubenswrapper[4970]: I1128 13:20:12.379798 4970 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 28 13:20:12 crc kubenswrapper[4970]: I1128 13:20:12.379817 4970 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 28 13:20:12 crc kubenswrapper[4970]: I1128 13:20:12.379888 4970 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 28 13:20:12 crc kubenswrapper[4970]: E1128 13:20:12.379920 4970 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 28 13:20:12 crc kubenswrapper[4970]: E1128 13:20:12.379969 4970 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 28 13:20:12 crc kubenswrapper[4970]: E1128 13:20:12.380050 4970 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 28 13:20:12 crc kubenswrapper[4970]: I1128 13:20:12.535403 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"b553b8039ce39b53ee4fd99570a9486f387d046e2575c6b224a275be698dee30"} Nov 28 13:20:12 crc kubenswrapper[4970]: I1128 13:20:12.546954 4970 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:20:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:20:10Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:20:12Z is after 2025-08-24T17:21:41Z" Nov 28 13:20:12 crc kubenswrapper[4970]: I1128 13:20:12.559514 4970 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:20:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:20:10Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:20:12Z is after 2025-08-24T17:21:41Z" Nov 28 13:20:12 crc kubenswrapper[4970]: I1128 13:20:12.572729 4970 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:20:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:20:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:20:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://007492aa3769f7765a343b0ff78727224d27d55e790f5db415ad23f286f8d268\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:20:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:20:12Z is after 2025-08-24T17:21:41Z" Nov 28 13:20:12 crc kubenswrapper[4970]: I1128 13:20:12.585696 4970 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:20:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:20:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:20:12Z is after 2025-08-24T17:21:41Z" Nov 28 13:20:12 crc kubenswrapper[4970]: I1128 13:20:12.596803 4970 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:20:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:20:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:20:12Z is after 2025-08-24T17:21:41Z" Nov 28 13:20:12 crc kubenswrapper[4970]: I1128 13:20:12.609124 4970 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:20:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:20:10Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:20:12Z is after 2025-08-24T17:21:41Z" Nov 28 13:20:12 crc kubenswrapper[4970]: I1128 13:20:12.622601 4970 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"aa8646b0-d3c4-45ac-9de5-c342099d5515\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:49Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://db18f7f1730110037ceac0335bfd4c4176cd8e6b4f3aeb9f7caacaeb82c0a9e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1886c741d951051096d9345e87c6c1386f48b9576ae2d6ed095a79be42c5297c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c49f9caf580dac97faa02766cf5d85c9483f0b201e1987c285ad9aa45178374\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://478a53f5d28b31054e796d95569a25b33787f3d2562ca562603b609f530e95d0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://89211b552b7e2893a562bd785f35642f3b6792b0a123133d459a728b9ca6f5f3\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-28T13:20:09Z\\\",\\\"message\\\":\\\"le observer\\\\nW1128 13:20:08.565000 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1128 13:20:08.565307 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1128 13:20:08.566365 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2469512425/tls.crt::/tmp/serving-cert-2469512425/tls.key\\\\\\\"\\\\nI1128 13:20:09.017237 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1128 13:20:09.069677 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1128 13:20:09.069747 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1128 13:20:09.069785 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1128 13:20:09.069796 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1128 13:20:09.080333 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1128 13:20:09.080422 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1128 13:20:09.080445 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1128 13:20:09.080467 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1128 13:20:09.080487 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1128 13:20:09.080507 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1128 13:20:09.080527 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1128 13:20:09.080685 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1128 13:20:09.082579 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-28T13:19:53Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:20:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d5b3592754b29f4aebbcfcdd7d4f38b10e70438680d2f9d86409bd2af4c1b84\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:53Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9535aede1ed0a237e5190688ad4f4177d2a18a1bde3c6fb051dd71fdbc61ed52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9535aede1ed0a237e5190688ad4f4177d2a18a1bde3c6fb051dd71fdbc61ed52\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-28T13:19:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-28T13:19:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-28T13:19:49Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:20:12Z is after 2025-08-24T17:21:41Z" Nov 28 13:20:12 crc kubenswrapper[4970]: I1128 13:20:12.632907 4970 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:20:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:20:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:20:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b553b8039ce39b53ee4fd99570a9486f387d046e2575c6b224a275be698dee30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:20:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1799f7e3e8d1006f9c689776dc0209a0a45e3a040c5aeed29ef719208f0320be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:20:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:20:12Z is after 2025-08-24T17:21:41Z" Nov 28 13:20:12 crc kubenswrapper[4970]: I1128 13:20:12.648181 4970 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:20:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:20:10Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:20:12Z is after 2025-08-24T17:21:41Z" Nov 28 13:20:12 crc kubenswrapper[4970]: I1128 13:20:12.660884 4970 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:20:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:20:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:20:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://007492aa3769f7765a343b0ff78727224d27d55e790f5db415ad23f286f8d268\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:20:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:20:12Z is after 2025-08-24T17:21:41Z" Nov 28 13:20:12 crc kubenswrapper[4970]: I1128 13:20:12.673400 4970 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:20:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:20:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:20:12Z is after 2025-08-24T17:21:41Z" Nov 28 13:20:12 crc kubenswrapper[4970]: I1128 13:20:12.690738 4970 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:20:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:20:10Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:20:12Z is after 2025-08-24T17:21:41Z" Nov 28 13:20:12 crc kubenswrapper[4970]: I1128 13:20:12.704390 4970 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:20:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:20:10Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:20:12Z is after 2025-08-24T17:21:41Z" Nov 28 13:20:12 crc kubenswrapper[4970]: I1128 13:20:12.720901 4970 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"aa8646b0-d3c4-45ac-9de5-c342099d5515\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:49Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://db18f7f1730110037ceac0335bfd4c4176cd8e6b4f3aeb9f7caacaeb82c0a9e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1886c741d951051096d9345e87c6c1386f48b9576ae2d6ed095a79be42c5297c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c49f9caf580dac97faa02766cf5d85c9483f0b201e1987c285ad9aa45178374\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://478a53f5d28b31054e796d95569a25b33787f3d2562ca562603b609f530e95d0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://89211b552b7e2893a562bd785f35642f3b6792b0a123133d459a728b9ca6f5f3\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-28T13:20:09Z\\\",\\\"message\\\":\\\"le observer\\\\nW1128 13:20:08.565000 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1128 13:20:08.565307 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1128 13:20:08.566365 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2469512425/tls.crt::/tmp/serving-cert-2469512425/tls.key\\\\\\\"\\\\nI1128 13:20:09.017237 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1128 13:20:09.069677 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1128 13:20:09.069747 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1128 13:20:09.069785 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1128 13:20:09.069796 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1128 13:20:09.080333 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1128 13:20:09.080422 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1128 13:20:09.080445 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1128 13:20:09.080467 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1128 13:20:09.080487 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1128 13:20:09.080507 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1128 13:20:09.080527 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1128 13:20:09.080685 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1128 13:20:09.082579 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-28T13:19:53Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:20:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d5b3592754b29f4aebbcfcdd7d4f38b10e70438680d2f9d86409bd2af4c1b84\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:53Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9535aede1ed0a237e5190688ad4f4177d2a18a1bde3c6fb051dd71fdbc61ed52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9535aede1ed0a237e5190688ad4f4177d2a18a1bde3c6fb051dd71fdbc61ed52\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-28T13:19:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-28T13:19:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-28T13:19:49Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:20:12Z is after 2025-08-24T17:21:41Z" Nov 28 13:20:13 crc kubenswrapper[4970]: I1128 13:20:13.345458 4970 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-vmzw2"] Nov 28 13:20:13 crc kubenswrapper[4970]: I1128 13:20:13.345981 4970 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-vmzw2" Nov 28 13:20:13 crc kubenswrapper[4970]: I1128 13:20:13.347563 4970 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Nov 28 13:20:13 crc kubenswrapper[4970]: I1128 13:20:13.347580 4970 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Nov 28 13:20:13 crc kubenswrapper[4970]: I1128 13:20:13.350633 4970 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Nov 28 13:20:13 crc kubenswrapper[4970]: I1128 13:20:13.364750 4970 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:20:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:20:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:20:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b553b8039ce39b53ee4fd99570a9486f387d046e2575c6b224a275be698dee30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:20:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1799f7e3e8d1006f9c689776dc0209a0a45e3a040c5aeed29ef719208f0320be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:20:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:20:13Z is after 2025-08-24T17:21:41Z" Nov 28 13:20:13 crc kubenswrapper[4970]: I1128 13:20:13.387103 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dkx5q\" (UniqueName: \"kubernetes.io/projected/7a706317-e9d0-4a0e-be0a-f85da6e77cbd-kube-api-access-dkx5q\") pod \"node-resolver-vmzw2\" (UID: \"7a706317-e9d0-4a0e-be0a-f85da6e77cbd\") " pod="openshift-dns/node-resolver-vmzw2" Nov 28 13:20:13 crc kubenswrapper[4970]: I1128 13:20:13.387144 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/7a706317-e9d0-4a0e-be0a-f85da6e77cbd-hosts-file\") pod \"node-resolver-vmzw2\" (UID: \"7a706317-e9d0-4a0e-be0a-f85da6e77cbd\") " pod="openshift-dns/node-resolver-vmzw2" Nov 28 13:20:13 crc kubenswrapper[4970]: I1128 13:20:13.395511 4970 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:20:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:20:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:20:13Z is after 2025-08-24T17:21:41Z" Nov 28 13:20:13 crc kubenswrapper[4970]: I1128 13:20:13.416097 4970 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:20:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:20:10Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:20:13Z is after 2025-08-24T17:21:41Z" Nov 28 13:20:13 crc kubenswrapper[4970]: I1128 13:20:13.428776 4970 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:20:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:20:10Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:20:13Z is after 2025-08-24T17:21:41Z" Nov 28 13:20:13 crc kubenswrapper[4970]: I1128 13:20:13.449848 4970 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"aa8646b0-d3c4-45ac-9de5-c342099d5515\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:49Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://db18f7f1730110037ceac0335bfd4c4176cd8e6b4f3aeb9f7caacaeb82c0a9e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1886c741d951051096d9345e87c6c1386f48b9576ae2d6ed095a79be42c5297c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c49f9caf580dac97faa02766cf5d85c9483f0b201e1987c285ad9aa45178374\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://478a53f5d28b31054e796d95569a25b33787f3d2562ca562603b609f530e95d0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://89211b552b7e2893a562bd785f35642f3b6792b0a123133d459a728b9ca6f5f3\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-28T13:20:09Z\\\",\\\"message\\\":\\\"le observer\\\\nW1128 13:20:08.565000 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1128 13:20:08.565307 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1128 13:20:08.566365 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2469512425/tls.crt::/tmp/serving-cert-2469512425/tls.key\\\\\\\"\\\\nI1128 13:20:09.017237 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1128 13:20:09.069677 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1128 13:20:09.069747 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1128 13:20:09.069785 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1128 13:20:09.069796 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1128 13:20:09.080333 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1128 13:20:09.080422 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1128 13:20:09.080445 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1128 13:20:09.080467 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1128 13:20:09.080487 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1128 13:20:09.080507 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1128 13:20:09.080527 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1128 13:20:09.080685 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1128 13:20:09.082579 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-28T13:19:53Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:20:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d5b3592754b29f4aebbcfcdd7d4f38b10e70438680d2f9d86409bd2af4c1b84\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:53Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9535aede1ed0a237e5190688ad4f4177d2a18a1bde3c6fb051dd71fdbc61ed52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9535aede1ed0a237e5190688ad4f4177d2a18a1bde3c6fb051dd71fdbc61ed52\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-28T13:19:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-28T13:19:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-28T13:19:49Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:20:13Z is after 2025-08-24T17:21:41Z" Nov 28 13:20:13 crc kubenswrapper[4970]: I1128 13:20:13.450364 4970 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/node-ca-v8ddv"] Nov 28 13:20:13 crc kubenswrapper[4970]: I1128 13:20:13.450686 4970 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-v8ddv" Nov 28 13:20:13 crc kubenswrapper[4970]: I1128 13:20:13.456636 4970 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Nov 28 13:20:13 crc kubenswrapper[4970]: I1128 13:20:13.456988 4970 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Nov 28 13:20:13 crc kubenswrapper[4970]: I1128 13:20:13.457078 4970 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Nov 28 13:20:13 crc kubenswrapper[4970]: I1128 13:20:13.458185 4970 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Nov 28 13:20:13 crc kubenswrapper[4970]: I1128 13:20:13.465053 4970 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:20:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:20:10Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:20:13Z is after 2025-08-24T17:21:41Z" Nov 28 13:20:13 crc kubenswrapper[4970]: I1128 13:20:13.475874 4970 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:20:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:20:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:20:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://007492aa3769f7765a343b0ff78727224d27d55e790f5db415ad23f286f8d268\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:20:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:20:13Z is after 2025-08-24T17:21:41Z" Nov 28 13:20:13 crc kubenswrapper[4970]: I1128 13:20:13.483870 4970 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-vmzw2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7a706317-e9d0-4a0e-be0a-f85da6e77cbd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:20:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:20:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:20:13Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:20:13Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dkx5q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-28T13:20:13Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-vmzw2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:20:13Z is after 2025-08-24T17:21:41Z" Nov 28 13:20:13 crc kubenswrapper[4970]: I1128 13:20:13.487897 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8bg9m\" (UniqueName: \"kubernetes.io/projected/40a1e63a-aa83-4b08-aea7-9e03807f66a5-kube-api-access-8bg9m\") pod \"node-ca-v8ddv\" (UID: \"40a1e63a-aa83-4b08-aea7-9e03807f66a5\") " pod="openshift-image-registry/node-ca-v8ddv" Nov 28 13:20:13 crc kubenswrapper[4970]: I1128 13:20:13.487934 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/40a1e63a-aa83-4b08-aea7-9e03807f66a5-host\") pod \"node-ca-v8ddv\" (UID: \"40a1e63a-aa83-4b08-aea7-9e03807f66a5\") " pod="openshift-image-registry/node-ca-v8ddv" Nov 28 13:20:13 crc kubenswrapper[4970]: I1128 13:20:13.487957 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/40a1e63a-aa83-4b08-aea7-9e03807f66a5-serviceca\") pod \"node-ca-v8ddv\" (UID: 
\"40a1e63a-aa83-4b08-aea7-9e03807f66a5\") " pod="openshift-image-registry/node-ca-v8ddv" Nov 28 13:20:13 crc kubenswrapper[4970]: I1128 13:20:13.488016 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/7a706317-e9d0-4a0e-be0a-f85da6e77cbd-hosts-file\") pod \"node-resolver-vmzw2\" (UID: \"7a706317-e9d0-4a0e-be0a-f85da6e77cbd\") " pod="openshift-dns/node-resolver-vmzw2" Nov 28 13:20:13 crc kubenswrapper[4970]: I1128 13:20:13.488038 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dkx5q\" (UniqueName: \"kubernetes.io/projected/7a706317-e9d0-4a0e-be0a-f85da6e77cbd-kube-api-access-dkx5q\") pod \"node-resolver-vmzw2\" (UID: \"7a706317-e9d0-4a0e-be0a-f85da6e77cbd\") " pod="openshift-dns/node-resolver-vmzw2" Nov 28 13:20:13 crc kubenswrapper[4970]: I1128 13:20:13.488331 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/7a706317-e9d0-4a0e-be0a-f85da6e77cbd-hosts-file\") pod \"node-resolver-vmzw2\" (UID: \"7a706317-e9d0-4a0e-be0a-f85da6e77cbd\") " pod="openshift-dns/node-resolver-vmzw2" Nov 28 13:20:13 crc kubenswrapper[4970]: I1128 13:20:13.496377 4970 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:20:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:20:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:20:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://007492aa3769f7765a343b0ff78727224d27d55e790f5db415ad23f286f8d268\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:20:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:20:13Z is after 2025-08-24T17:21:41Z" Nov 28 13:20:13 crc 
kubenswrapper[4970]: I1128 13:20:13.504827 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dkx5q\" (UniqueName: \"kubernetes.io/projected/7a706317-e9d0-4a0e-be0a-f85da6e77cbd-kube-api-access-dkx5q\") pod \"node-resolver-vmzw2\" (UID: \"7a706317-e9d0-4a0e-be0a-f85da6e77cbd\") " pod="openshift-dns/node-resolver-vmzw2" Nov 28 13:20:13 crc kubenswrapper[4970]: I1128 13:20:13.506827 4970 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:20:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:20:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:20:13Z is after 2025-08-24T17:21:41Z" Nov 28 13:20:13 crc kubenswrapper[4970]: I1128 13:20:13.517444 4970 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:20:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:20:10Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:20:13Z is after 2025-08-24T17:21:41Z" Nov 28 13:20:13 crc kubenswrapper[4970]: I1128 13:20:13.527998 4970 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:20:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:20:10Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:20:13Z is after 2025-08-24T17:21:41Z" Nov 28 13:20:13 crc kubenswrapper[4970]: I1128 13:20:13.547752 4970 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"aa8646b0-d3c4-45ac-9de5-c342099d5515\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:49Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:19:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://db18f7f1730110037ceac0335bfd4c4176cd8e6b4f3aeb9f7caacaeb82c0a9e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1886c741d951051096d9345e87c6c1386f48b9576ae2d6ed095a79be42c5297c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c49f9caf580dac97faa02766cf5d85c9483f0b201e1987c285ad9aa45178374\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://478a53f5d28b31054e796d95569a25b33787f3d2562ca562603b609f530e95d0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://89211b552b7e2893a562bd785f35642f3b6792b0a123133d459a728b9ca6f5f3\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-28T13:20:09Z\\\",\\\"message\\\":\\\"le observer\\\\nW1128 13:20:08.565000 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1128 13:20:08.565307 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1128 13:20:08.566365 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2469512425/tls.crt::/tmp/serving-cert-2469512425/tls.key\\\\\\\"\\\\nI1128 13:20:09.017237 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1128 13:20:09.069677 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1128 13:20:09.069747 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1128 13:20:09.069785 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1128 13:20:09.069796 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1128 13:20:09.080333 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1128 13:20:09.080422 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1128 13:20:09.080445 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1128 13:20:09.080467 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1128 13:20:09.080487 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1128 13:20:09.080507 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1128 13:20:09.080527 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1128 13:20:09.080685 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1128 13:20:09.082579 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-28T13:19:53Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:20:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d5b3592754b29f4aebbcfcdd7d4f38b10e70438680d2f9d86409bd2af4c1b84\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-28T13:19:53Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9535aede1ed0a237e5190688ad4f4177d2a18a1bde3c6fb051dd71fdbc61ed52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9535aede1ed0a237e5190688ad4f4177d2a18a1bde3c6fb051dd71fdbc61ed52\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-28T13:19:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-28T13:19:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-28T13:19:49Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:20:13Z is after 2025-08-24T17:21:41Z" Nov 28 13:20:13 crc kubenswrapper[4970]: I1128 13:20:13.562693 4970 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:20:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-28T13:20:10Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:20:13Z is after 2025-08-24T17:21:41Z" Nov 28 13:20:13 crc kubenswrapper[4970]: I1128 13:20:13.572140 4970 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-vmzw2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7a706317-e9d0-4a0e-be0a-f85da6e77cbd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:20:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:20:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:20:13Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:20:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dkx5q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-28T13:20:13Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-vmzw2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:20:13Z is after 2025-08-24T17:21:41Z" Nov 28 13:20:13 crc kubenswrapper[4970]: I1128 13:20:13.583011 4970 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-v8ddv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"40a1e63a-aa83-4b08-aea7-9e03807f66a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:20:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:20:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:20:13Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-28T13:20:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8bg9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-28T13:20:13Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-v8ddv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-28T13:20:13Z is after 2025-08-24T17:21:41Z" Nov 28 13:20:13 crc kubenswrapper[4970]: I1128 13:20:13.588429 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8bg9m\" (UniqueName: \"kubernetes.io/projected/40a1e63a-aa83-4b08-aea7-9e03807f66a5-kube-api-access-8bg9m\") pod \"node-ca-v8ddv\" (UID: \"40a1e63a-aa83-4b08-aea7-9e03807f66a5\") " pod="openshift-image-registry/node-ca-v8ddv" Nov 28 13:20:13 crc kubenswrapper[4970]: I1128 13:20:13.588466 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/40a1e63a-aa83-4b08-aea7-9e03807f66a5-host\") pod \"node-ca-v8ddv\" (UID: \"40a1e63a-aa83-4b08-aea7-9e03807f66a5\") " pod="openshift-image-registry/node-ca-v8ddv" Nov 28 13:20:13 crc kubenswrapper[4970]: I1128 13:20:13.588486 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/40a1e63a-aa83-4b08-aea7-9e03807f66a5-serviceca\") pod \"node-ca-v8ddv\" (UID: \"40a1e63a-aa83-4b08-aea7-9e03807f66a5\") " pod="openshift-image-registry/node-ca-v8ddv" Nov 28 13:20:13 crc kubenswrapper[4970]: I1128 13:20:13.588508 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/40a1e63a-aa83-4b08-aea7-9e03807f66a5-host\") pod \"node-ca-v8ddv\" (UID: \"40a1e63a-aa83-4b08-aea7-9e03807f66a5\") " pod="openshift-image-registry/node-ca-v8ddv" Nov 28 13:20:13 crc kubenswrapper[4970]: I1128 13:20:13.589598 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/40a1e63a-aa83-4b08-aea7-9e03807f66a5-serviceca\") pod \"node-ca-v8ddv\" (UID: \"40a1e63a-aa83-4b08-aea7-9e03807f66a5\") " pod="openshift-image-registry/node-ca-v8ddv" Nov 28 13:20:13 crc kubenswrapper[4970]: I1128 13:20:13.607714 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8bg9m\" (UniqueName: \"kubernetes.io/projected/40a1e63a-aa83-4b08-aea7-9e03807f66a5-kube-api-access-8bg9m\") pod 
\"node-ca-v8ddv\" (UID: \"40a1e63a-aa83-4b08-aea7-9e03807f66a5\") " pod="openshift-image-registry/node-ca-v8ddv" Nov 28 13:20:13 crc kubenswrapper[4970]: I1128 13:20:13.659513 4970 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-vmzw2" Nov 28 13:20:13 crc kubenswrapper[4970]: W1128 13:20:13.670183 4970 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7a706317_e9d0_4a0e_be0a_f85da6e77cbd.slice/crio-bd5716329de5c1c463112f414afc083912b3a57cd6b1a7fe7f56c0a973f6ab24 WatchSource:0}: Error finding container bd5716329de5c1c463112f414afc083912b3a57cd6b1a7fe7f56c0a973f6ab24: Status 404 returned error can't find the container with id bd5716329de5c1c463112f414afc083912b3a57cd6b1a7fe7f56c0a973f6ab24 Nov 28 13:20:13 crc kubenswrapper[4970]: I1128 13:20:13.762563 4970 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-v8ddv" Nov 28 13:20:13 crc kubenswrapper[4970]: W1128 13:20:13.772877 4970 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod40a1e63a_aa83_4b08_aea7_9e03807f66a5.slice/crio-ab7a31ce46096e325d6431cd2f4c899bd7928fdb8eddcc6ee74e022fde5c1173 WatchSource:0}: Error finding container ab7a31ce46096e325d6431cd2f4c899bd7928fdb8eddcc6ee74e022fde5c1173: Status 404 returned error can't find the container with id ab7a31ce46096e325d6431cd2f4c899bd7928fdb8eddcc6ee74e022fde5c1173 Nov 28 13:20:13 crc kubenswrapper[4970]: I1128 13:20:13.825439 4970 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-daemon-tjrng"] Nov 28 13:20:13 crc kubenswrapper[4970]: I1128 13:20:13.825757 4970 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-tjrng" Nov 28 13:20:13 crc kubenswrapper[4970]: I1128 13:20:13.827777 4970 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Nov 28 13:20:13 crc kubenswrapper[4970]: I1128 13:20:13.827805 4970 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Nov 28 13:20:13 crc kubenswrapper[4970]: I1128 13:20:13.827777 4970 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Nov 28 13:20:13 crc kubenswrapper[4970]: I1128 13:20:13.828485 4970 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Nov 28 13:20:13 crc kubenswrapper[4970]: I1128 13:20:13.828504 4970 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Nov 28 13:20:13 crc kubenswrapper[4970]: I1128 13:20:13.835505 4970 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-krtxh"] Nov 28 13:20:13 crc kubenswrapper[4970]: I1128 13:20:13.835916 4970 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-additional-cni-plugins-2956x"] Nov 28 13:20:13 crc kubenswrapper[4970]: I1128 13:20:13.836155 4970 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-krtxh" Nov 28 13:20:13 crc kubenswrapper[4970]: I1128 13:20:13.836738 4970 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-2956x" Nov 28 13:20:13 crc kubenswrapper[4970]: I1128 13:20:13.839116 4970 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Nov 28 13:20:13 crc kubenswrapper[4970]: I1128 13:20:13.839416 4970 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Nov 28 13:20:13 crc kubenswrapper[4970]: I1128 13:20:13.839624 4970 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Nov 28 13:20:13 crc kubenswrapper[4970]: I1128 13:20:13.843391 4970 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Nov 28 13:20:13 crc kubenswrapper[4970]: I1128 13:20:13.843430 4970 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Nov 28 13:20:13 crc kubenswrapper[4970]: I1128 13:20:13.843815 4970 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Nov 28 13:20:13 crc kubenswrapper[4970]: I1128 13:20:13.844023 4970 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Nov 28 13:20:13 crc kubenswrapper[4970]: I1128 13:20:13.864707 4970 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=3.8646855430000002 podStartE2EDuration="3.864685543s" podCreationTimestamp="2025-11-28 13:20:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-28 13:20:13.854420354 +0000 UTC m=+24.707302164" watchObservedRunningTime="2025-11-28 13:20:13.864685543 +0000 UTC m=+24.717567343" Nov 28 13:20:13 crc kubenswrapper[4970]: I1128 13:20:13.865017 4970 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-6c6s9"] Nov 28 13:20:13 crc kubenswrapper[4970]: I1128 13:20:13.865762 4970 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-6c6s9" Nov 28 13:20:13 crc kubenswrapper[4970]: I1128 13:20:13.867545 4970 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Nov 28 13:20:13 crc kubenswrapper[4970]: I1128 13:20:13.867709 4970 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Nov 28 13:20:13 crc kubenswrapper[4970]: I1128 13:20:13.868559 4970 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Nov 28 13:20:13 crc kubenswrapper[4970]: I1128 13:20:13.871190 4970 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Nov 28 13:20:13 crc kubenswrapper[4970]: I1128 13:20:13.871317 4970 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Nov 28 13:20:13 crc kubenswrapper[4970]: I1128 13:20:13.871912 4970 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Nov 28 13:20:13 crc kubenswrapper[4970]: I1128 13:20:13.873101 4970 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Nov 28 13:20:13 crc kubenswrapper[4970]: I1128 13:20:13.891340 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/ddf11f1e-5631-4329-9db4-b75fed094c5f-host-run-netns\") pod \"multus-krtxh\" (UID: \"ddf11f1e-5631-4329-9db4-b75fed094c5f\") " pod="openshift-multus/multus-krtxh" Nov 28 13:20:13 crc kubenswrapper[4970]: I1128 13:20:13.891377 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-599b7\" (UniqueName: \"kubernetes.io/projected/c3f8e57d-69cb-4704-a35b-ca570d60797e-kube-api-access-599b7\") pod \"multus-additional-cni-plugins-2956x\" (UID: \"c3f8e57d-69cb-4704-a35b-ca570d60797e\") " pod="openshift-multus/multus-additional-cni-plugins-2956x" Nov 28 13:20:13 crc kubenswrapper[4970]: I1128 13:20:13.891396 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/ddf11f1e-5631-4329-9db4-b75fed094c5f-system-cni-dir\") pod \"multus-krtxh\" (UID: \"ddf11f1e-5631-4329-9db4-b75fed094c5f\") " pod="openshift-multus/multus-krtxh" Nov 28 13:20:13 crc kubenswrapper[4970]: I1128 13:20:13.891413 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tr469\" (UniqueName: \"kubernetes.io/projected/70bedd43-c527-436e-b47b-0b9ec5b10601-kube-api-access-tr469\") pod \"machine-config-daemon-tjrng\" (UID: \"70bedd43-c527-436e-b47b-0b9ec5b10601\") " pod="openshift-machine-config-operator/machine-config-daemon-tjrng" Nov 28 13:20:13 crc kubenswrapper[4970]: I1128 13:20:13.891427 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/c3f8e57d-69cb-4704-a35b-ca570d60797e-os-release\") pod \"multus-additional-cni-plugins-2956x\" (UID: \"c3f8e57d-69cb-4704-a35b-ca570d60797e\") " pod="openshift-multus/multus-additional-cni-plugins-2956x" Nov 28 13:20:13 crc kubenswrapper[4970]: I1128 13:20:13.891442 4970 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/ddf11f1e-5631-4329-9db4-b75fed094c5f-host-var-lib-cni-multus\") pod \"multus-krtxh\" (UID: \"ddf11f1e-5631-4329-9db4-b75fed094c5f\") " pod="openshift-multus/multus-krtxh" Nov 28 13:20:13 crc kubenswrapper[4970]: I1128 13:20:13.891455 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/c3f8e57d-69cb-4704-a35b-ca570d60797e-tuning-conf-dir\") pod \"multus-additional-cni-plugins-2956x\" (UID: \"c3f8e57d-69cb-4704-a35b-ca570d60797e\") " pod="openshift-multus/multus-additional-cni-plugins-2956x" Nov 28 13:20:13 crc kubenswrapper[4970]: I1128 13:20:13.891469 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/c3f8e57d-69cb-4704-a35b-ca570d60797e-cnibin\") pod \"multus-additional-cni-plugins-2956x\" (UID: \"c3f8e57d-69cb-4704-a35b-ca570d60797e\") " pod="openshift-multus/multus-additional-cni-plugins-2956x" Nov 28 13:20:13 crc kubenswrapper[4970]: I1128 13:20:13.891482 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/ddf11f1e-5631-4329-9db4-b75fed094c5f-os-release\") pod \"multus-krtxh\" (UID: \"ddf11f1e-5631-4329-9db4-b75fed094c5f\") " pod="openshift-multus/multus-krtxh" Nov 28 13:20:13 crc kubenswrapper[4970]: I1128 13:20:13.891496 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/ddf11f1e-5631-4329-9db4-b75fed094c5f-hostroot\") pod \"multus-krtxh\" (UID: \"ddf11f1e-5631-4329-9db4-b75fed094c5f\") " pod="openshift-multus/multus-krtxh" Nov 28 13:20:13 crc kubenswrapper[4970]: I1128 13:20:13.891520 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ddf11f1e-5631-4329-9db4-b75fed094c5f-etc-kubernetes\") pod \"multus-krtxh\" (UID: \"ddf11f1e-5631-4329-9db4-b75fed094c5f\") " pod="openshift-multus/multus-krtxh" Nov 28 13:20:13 crc kubenswrapper[4970]: I1128 13:20:13.891534 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/ddf11f1e-5631-4329-9db4-b75fed094c5f-multus-daemon-config\") pod \"multus-krtxh\" (UID: \"ddf11f1e-5631-4329-9db4-b75fed094c5f\") " pod="openshift-multus/multus-krtxh" Nov 28 13:20:13 crc kubenswrapper[4970]: I1128 13:20:13.891547 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/ddf11f1e-5631-4329-9db4-b75fed094c5f-host-run-k8s-cni-cncf-io\") pod \"multus-krtxh\" (UID: \"ddf11f1e-5631-4329-9db4-b75fed094c5f\") " pod="openshift-multus/multus-krtxh" Nov 28 13:20:13 crc kubenswrapper[4970]: I1128 13:20:13.891563 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/c3f8e57d-69cb-4704-a35b-ca570d60797e-cni-binary-copy\") pod \"multus-additional-cni-plugins-2956x\" (UID: \"c3f8e57d-69cb-4704-a35b-ca570d60797e\") " pod="openshift-multus/multus-additional-cni-plugins-2956x" Nov 28 13:20:13 crc kubenswrapper[4970]: 
I1128 13:20:13.891576 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/ddf11f1e-5631-4329-9db4-b75fed094c5f-multus-cni-dir\") pod \"multus-krtxh\" (UID: \"ddf11f1e-5631-4329-9db4-b75fed094c5f\") " pod="openshift-multus/multus-krtxh" Nov 28 13:20:13 crc kubenswrapper[4970]: I1128 13:20:13.891591 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/ddf11f1e-5631-4329-9db4-b75fed094c5f-multus-conf-dir\") pod \"multus-krtxh\" (UID: \"ddf11f1e-5631-4329-9db4-b75fed094c5f\") " pod="openshift-multus/multus-krtxh" Nov 28 13:20:13 crc kubenswrapper[4970]: I1128 13:20:13.891606 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/70bedd43-c527-436e-b47b-0b9ec5b10601-mcd-auth-proxy-config\") pod \"machine-config-daemon-tjrng\" (UID: \"70bedd43-c527-436e-b47b-0b9ec5b10601\") " pod="openshift-machine-config-operator/machine-config-daemon-tjrng" Nov 28 13:20:13 crc kubenswrapper[4970]: I1128 13:20:13.891621 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/c3f8e57d-69cb-4704-a35b-ca570d60797e-system-cni-dir\") pod \"multus-additional-cni-plugins-2956x\" (UID: \"c3f8e57d-69cb-4704-a35b-ca570d60797e\") " pod="openshift-multus/multus-additional-cni-plugins-2956x" Nov 28 13:20:13 crc kubenswrapper[4970]: I1128 13:20:13.891635 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/ddf11f1e-5631-4329-9db4-b75fed094c5f-cni-binary-copy\") pod \"multus-krtxh\" (UID: \"ddf11f1e-5631-4329-9db4-b75fed094c5f\") " pod="openshift-multus/multus-krtxh" Nov 28 13:20:13 crc kubenswrapper[4970]: I1128 13:20:13.891665 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/70bedd43-c527-436e-b47b-0b9ec5b10601-rootfs\") pod \"machine-config-daemon-tjrng\" (UID: \"70bedd43-c527-436e-b47b-0b9ec5b10601\") " pod="openshift-machine-config-operator/machine-config-daemon-tjrng" Nov 28 13:20:13 crc kubenswrapper[4970]: I1128 13:20:13.891678 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/ddf11f1e-5631-4329-9db4-b75fed094c5f-host-var-lib-kubelet\") pod \"multus-krtxh\" (UID: \"ddf11f1e-5631-4329-9db4-b75fed094c5f\") " pod="openshift-multus/multus-krtxh" Nov 28 13:20:13 crc kubenswrapper[4970]: I1128 13:20:13.891690 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n7jgw\" (UniqueName: \"kubernetes.io/projected/ddf11f1e-5631-4329-9db4-b75fed094c5f-kube-api-access-n7jgw\") pod \"multus-krtxh\" (UID: \"ddf11f1e-5631-4329-9db4-b75fed094c5f\") " pod="openshift-multus/multus-krtxh" Nov 28 13:20:13 crc kubenswrapper[4970]: I1128 13:20:13.891703 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/ddf11f1e-5631-4329-9db4-b75fed094c5f-cnibin\") pod \"multus-krtxh\" (UID: \"ddf11f1e-5631-4329-9db4-b75fed094c5f\") " 
pod="openshift-multus/multus-krtxh" Nov 28 13:20:13 crc kubenswrapper[4970]: I1128 13:20:13.891716 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/ddf11f1e-5631-4329-9db4-b75fed094c5f-host-var-lib-cni-bin\") pod \"multus-krtxh\" (UID: \"ddf11f1e-5631-4329-9db4-b75fed094c5f\") " pod="openshift-multus/multus-krtxh" Nov 28 13:20:13 crc kubenswrapper[4970]: I1128 13:20:13.891730 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/c3f8e57d-69cb-4704-a35b-ca570d60797e-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-2956x\" (UID: \"c3f8e57d-69cb-4704-a35b-ca570d60797e\") " pod="openshift-multus/multus-additional-cni-plugins-2956x" Nov 28 13:20:13 crc kubenswrapper[4970]: I1128 13:20:13.891746 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/ddf11f1e-5631-4329-9db4-b75fed094c5f-multus-socket-dir-parent\") pod \"multus-krtxh\" (UID: \"ddf11f1e-5631-4329-9db4-b75fed094c5f\") " pod="openshift-multus/multus-krtxh" Nov 28 13:20:13 crc kubenswrapper[4970]: I1128 13:20:13.891760 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/70bedd43-c527-436e-b47b-0b9ec5b10601-proxy-tls\") pod \"machine-config-daemon-tjrng\" (UID: \"70bedd43-c527-436e-b47b-0b9ec5b10601\") " pod="openshift-machine-config-operator/machine-config-daemon-tjrng" Nov 28 13:20:13 crc kubenswrapper[4970]: I1128 13:20:13.891773 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/ddf11f1e-5631-4329-9db4-b75fed094c5f-host-run-multus-certs\") pod \"multus-krtxh\" (UID: \"ddf11f1e-5631-4329-9db4-b75fed094c5f\") " pod="openshift-multus/multus-krtxh" Nov 28 13:20:13 crc kubenswrapper[4970]: I1128 13:20:13.992272 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 28 13:20:13 crc kubenswrapper[4970]: I1128 13:20:13.992349 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/70bedd43-c527-436e-b47b-0b9ec5b10601-proxy-tls\") pod \"machine-config-daemon-tjrng\" (UID: \"70bedd43-c527-436e-b47b-0b9ec5b10601\") " pod="openshift-machine-config-operator/machine-config-daemon-tjrng" Nov 28 13:20:13 crc kubenswrapper[4970]: I1128 13:20:13.992371 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/ddf11f1e-5631-4329-9db4-b75fed094c5f-host-run-multus-certs\") pod \"multus-krtxh\" (UID: \"ddf11f1e-5631-4329-9db4-b75fed094c5f\") " pod="openshift-multus/multus-krtxh" Nov 28 13:20:13 crc kubenswrapper[4970]: I1128 13:20:13.992392 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/17474edc-f114-4ee6-b6bb-95b55f1731ac-host-run-ovn-kubernetes\") 
pod \"ovnkube-node-6c6s9\" (UID: \"17474edc-f114-4ee6-b6bb-95b55f1731ac\") " pod="openshift-ovn-kubernetes/ovnkube-node-6c6s9" Nov 28 13:20:13 crc kubenswrapper[4970]: E1128 13:20:13.992424 4970 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-28 13:20:17.992403108 +0000 UTC m=+28.845284908 (durationBeforeRetry 4s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 28 13:20:13 crc kubenswrapper[4970]: I1128 13:20:13.992469 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/ddf11f1e-5631-4329-9db4-b75fed094c5f-host-run-multus-certs\") pod \"multus-krtxh\" (UID: \"ddf11f1e-5631-4329-9db4-b75fed094c5f\") " pod="openshift-multus/multus-krtxh" Nov 28 13:20:13 crc kubenswrapper[4970]: I1128 13:20:13.992460 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/ddf11f1e-5631-4329-9db4-b75fed094c5f-host-run-netns\") pod \"multus-krtxh\" (UID: \"ddf11f1e-5631-4329-9db4-b75fed094c5f\") " pod="openshift-multus/multus-krtxh" Nov 28 13:20:13 crc kubenswrapper[4970]: I1128 13:20:13.992511 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/ddf11f1e-5631-4329-9db4-b75fed094c5f-host-run-netns\") pod \"multus-krtxh\" (UID: \"ddf11f1e-5631-4329-9db4-b75fed094c5f\") " pod="openshift-multus/multus-krtxh" Nov 28 13:20:13 crc kubenswrapper[4970]: I1128 13:20:13.992541 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/17474edc-f114-4ee6-b6bb-95b55f1731ac-var-lib-openvswitch\") pod \"ovnkube-node-6c6s9\" (UID: \"17474edc-f114-4ee6-b6bb-95b55f1731ac\") " pod="openshift-ovn-kubernetes/ovnkube-node-6c6s9" Nov 28 13:20:13 crc kubenswrapper[4970]: I1128 13:20:13.992597 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/17474edc-f114-4ee6-b6bb-95b55f1731ac-ovnkube-config\") pod \"ovnkube-node-6c6s9\" (UID: \"17474edc-f114-4ee6-b6bb-95b55f1731ac\") " pod="openshift-ovn-kubernetes/ovnkube-node-6c6s9" Nov 28 13:20:13 crc kubenswrapper[4970]: I1128 13:20:13.992620 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-599b7\" (UniqueName: \"kubernetes.io/projected/c3f8e57d-69cb-4704-a35b-ca570d60797e-kube-api-access-599b7\") pod \"multus-additional-cni-plugins-2956x\" (UID: \"c3f8e57d-69cb-4704-a35b-ca570d60797e\") " pod="openshift-multus/multus-additional-cni-plugins-2956x" Nov 28 13:20:13 crc kubenswrapper[4970]: I1128 13:20:13.992638 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: 
\"kubernetes.io/host-path/ddf11f1e-5631-4329-9db4-b75fed094c5f-system-cni-dir\") pod \"multus-krtxh\" (UID: \"ddf11f1e-5631-4329-9db4-b75fed094c5f\") " pod="openshift-multus/multus-krtxh" Nov 28 13:20:13 crc kubenswrapper[4970]: I1128 13:20:13.992654 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/17474edc-f114-4ee6-b6bb-95b55f1731ac-run-systemd\") pod \"ovnkube-node-6c6s9\" (UID: \"17474edc-f114-4ee6-b6bb-95b55f1731ac\") " pod="openshift-ovn-kubernetes/ovnkube-node-6c6s9" Nov 28 13:20:13 crc kubenswrapper[4970]: I1128 13:20:13.992754 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/ddf11f1e-5631-4329-9db4-b75fed094c5f-system-cni-dir\") pod \"multus-krtxh\" (UID: \"ddf11f1e-5631-4329-9db4-b75fed094c5f\") " pod="openshift-multus/multus-krtxh" Nov 28 13:20:13 crc kubenswrapper[4970]: I1128 13:20:13.992784 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tr469\" (UniqueName: \"kubernetes.io/projected/70bedd43-c527-436e-b47b-0b9ec5b10601-kube-api-access-tr469\") pod \"machine-config-daemon-tjrng\" (UID: \"70bedd43-c527-436e-b47b-0b9ec5b10601\") " pod="openshift-machine-config-operator/machine-config-daemon-tjrng" Nov 28 13:20:13 crc kubenswrapper[4970]: I1128 13:20:13.993079 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/c3f8e57d-69cb-4704-a35b-ca570d60797e-os-release\") pod \"multus-additional-cni-plugins-2956x\" (UID: \"c3f8e57d-69cb-4704-a35b-ca570d60797e\") " pod="openshift-multus/multus-additional-cni-plugins-2956x" Nov 28 13:20:13 crc kubenswrapper[4970]: I1128 13:20:13.993101 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/ddf11f1e-5631-4329-9db4-b75fed094c5f-host-var-lib-cni-multus\") pod \"multus-krtxh\" (UID: \"ddf11f1e-5631-4329-9db4-b75fed094c5f\") " pod="openshift-multus/multus-krtxh" Nov 28 13:20:13 crc kubenswrapper[4970]: I1128 13:20:13.993118 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/c3f8e57d-69cb-4704-a35b-ca570d60797e-cnibin\") pod \"multus-additional-cni-plugins-2956x\" (UID: \"c3f8e57d-69cb-4704-a35b-ca570d60797e\") " pod="openshift-multus/multus-additional-cni-plugins-2956x" Nov 28 13:20:13 crc kubenswrapper[4970]: I1128 13:20:13.993133 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/c3f8e57d-69cb-4704-a35b-ca570d60797e-tuning-conf-dir\") pod \"multus-additional-cni-plugins-2956x\" (UID: \"c3f8e57d-69cb-4704-a35b-ca570d60797e\") " pod="openshift-multus/multus-additional-cni-plugins-2956x" Nov 28 13:20:13 crc kubenswrapper[4970]: I1128 13:20:13.993154 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/ddf11f1e-5631-4329-9db4-b75fed094c5f-host-var-lib-cni-multus\") pod \"multus-krtxh\" (UID: \"ddf11f1e-5631-4329-9db4-b75fed094c5f\") " pod="openshift-multus/multus-krtxh" Nov 28 13:20:13 crc kubenswrapper[4970]: I1128 13:20:13.993169 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: 
\"kubernetes.io/host-path/c3f8e57d-69cb-4704-a35b-ca570d60797e-cnibin\") pod \"multus-additional-cni-plugins-2956x\" (UID: \"c3f8e57d-69cb-4704-a35b-ca570d60797e\") " pod="openshift-multus/multus-additional-cni-plugins-2956x" Nov 28 13:20:13 crc kubenswrapper[4970]: I1128 13:20:13.993180 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/17474edc-f114-4ee6-b6bb-95b55f1731ac-systemd-units\") pod \"ovnkube-node-6c6s9\" (UID: \"17474edc-f114-4ee6-b6bb-95b55f1731ac\") " pod="openshift-ovn-kubernetes/ovnkube-node-6c6s9" Nov 28 13:20:13 crc kubenswrapper[4970]: I1128 13:20:13.993196 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/17474edc-f114-4ee6-b6bb-95b55f1731ac-etc-openvswitch\") pod \"ovnkube-node-6c6s9\" (UID: \"17474edc-f114-4ee6-b6bb-95b55f1731ac\") " pod="openshift-ovn-kubernetes/ovnkube-node-6c6s9" Nov 28 13:20:13 crc kubenswrapper[4970]: I1128 13:20:13.993239 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/17474edc-f114-4ee6-b6bb-95b55f1731ac-host-cni-bin\") pod \"ovnkube-node-6c6s9\" (UID: \"17474edc-f114-4ee6-b6bb-95b55f1731ac\") " pod="openshift-ovn-kubernetes/ovnkube-node-6c6s9" Nov 28 13:20:13 crc kubenswrapper[4970]: I1128 13:20:13.993259 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/ddf11f1e-5631-4329-9db4-b75fed094c5f-os-release\") pod \"multus-krtxh\" (UID: \"ddf11f1e-5631-4329-9db4-b75fed094c5f\") " pod="openshift-multus/multus-krtxh" Nov 28 13:20:13 crc kubenswrapper[4970]: I1128 13:20:13.993275 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/ddf11f1e-5631-4329-9db4-b75fed094c5f-hostroot\") pod \"multus-krtxh\" (UID: \"ddf11f1e-5631-4329-9db4-b75fed094c5f\") " pod="openshift-multus/multus-krtxh" Nov 28 13:20:13 crc kubenswrapper[4970]: I1128 13:20:13.993298 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/c3f8e57d-69cb-4704-a35b-ca570d60797e-os-release\") pod \"multus-additional-cni-plugins-2956x\" (UID: \"c3f8e57d-69cb-4704-a35b-ca570d60797e\") " pod="openshift-multus/multus-additional-cni-plugins-2956x" Nov 28 13:20:13 crc kubenswrapper[4970]: I1128 13:20:13.993317 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/17474edc-f114-4ee6-b6bb-95b55f1731ac-env-overrides\") pod \"ovnkube-node-6c6s9\" (UID: \"17474edc-f114-4ee6-b6bb-95b55f1731ac\") " pod="openshift-ovn-kubernetes/ovnkube-node-6c6s9" Nov 28 13:20:13 crc kubenswrapper[4970]: I1128 13:20:13.993332 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/ddf11f1e-5631-4329-9db4-b75fed094c5f-hostroot\") pod \"multus-krtxh\" (UID: \"ddf11f1e-5631-4329-9db4-b75fed094c5f\") " pod="openshift-multus/multus-krtxh" Nov 28 13:20:13 crc kubenswrapper[4970]: I1128 13:20:13.993346 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/17474edc-f114-4ee6-b6bb-95b55f1731ac-host-kubelet\") pod 
\"ovnkube-node-6c6s9\" (UID: \"17474edc-f114-4ee6-b6bb-95b55f1731ac\") " pod="openshift-ovn-kubernetes/ovnkube-node-6c6s9" Nov 28 13:20:13 crc kubenswrapper[4970]: I1128 13:20:13.993358 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/ddf11f1e-5631-4329-9db4-b75fed094c5f-os-release\") pod \"multus-krtxh\" (UID: \"ddf11f1e-5631-4329-9db4-b75fed094c5f\") " pod="openshift-multus/multus-krtxh" Nov 28 13:20:13 crc kubenswrapper[4970]: I1128 13:20:13.993376 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/17474edc-f114-4ee6-b6bb-95b55f1731ac-ovn-node-metrics-cert\") pod \"ovnkube-node-6c6s9\" (UID: \"17474edc-f114-4ee6-b6bb-95b55f1731ac\") " pod="openshift-ovn-kubernetes/ovnkube-node-6c6s9" Nov 28 13:20:13 crc kubenswrapper[4970]: I1128 13:20:13.993392 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ddf11f1e-5631-4329-9db4-b75fed094c5f-etc-kubernetes\") pod \"multus-krtxh\" (UID: \"ddf11f1e-5631-4329-9db4-b75fed094c5f\") " pod="openshift-multus/multus-krtxh" Nov 28 13:20:13 crc kubenswrapper[4970]: I1128 13:20:13.993406 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gsjj6\" (UniqueName: \"kubernetes.io/projected/17474edc-f114-4ee6-b6bb-95b55f1731ac-kube-api-access-gsjj6\") pod \"ovnkube-node-6c6s9\" (UID: \"17474edc-f114-4ee6-b6bb-95b55f1731ac\") " pod="openshift-ovn-kubernetes/ovnkube-node-6c6s9" Nov 28 13:20:13 crc kubenswrapper[4970]: I1128 13:20:13.993425 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/ddf11f1e-5631-4329-9db4-b75fed094c5f-multus-daemon-config\") pod \"multus-krtxh\" (UID: \"ddf11f1e-5631-4329-9db4-b75fed094c5f\") " pod="openshift-multus/multus-krtxh" Nov 28 13:20:13 crc kubenswrapper[4970]: I1128 13:20:13.993440 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/17474edc-f114-4ee6-b6bb-95b55f1731ac-host-run-netns\") pod \"ovnkube-node-6c6s9\" (UID: \"17474edc-f114-4ee6-b6bb-95b55f1731ac\") " pod="openshift-ovn-kubernetes/ovnkube-node-6c6s9" Nov 28 13:20:13 crc kubenswrapper[4970]: I1128 13:20:13.993459 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ddf11f1e-5631-4329-9db4-b75fed094c5f-etc-kubernetes\") pod \"multus-krtxh\" (UID: \"ddf11f1e-5631-4329-9db4-b75fed094c5f\") " pod="openshift-multus/multus-krtxh" Nov 28 13:20:13 crc kubenswrapper[4970]: I1128 13:20:13.993484 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/17474edc-f114-4ee6-b6bb-95b55f1731ac-run-openvswitch\") pod \"ovnkube-node-6c6s9\" (UID: \"17474edc-f114-4ee6-b6bb-95b55f1731ac\") " pod="openshift-ovn-kubernetes/ovnkube-node-6c6s9" Nov 28 13:20:13 crc kubenswrapper[4970]: I1128 13:20:13.993503 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/c3f8e57d-69cb-4704-a35b-ca570d60797e-cni-binary-copy\") pod \"multus-additional-cni-plugins-2956x\" (UID: 
\"c3f8e57d-69cb-4704-a35b-ca570d60797e\") " pod="openshift-multus/multus-additional-cni-plugins-2956x" Nov 28 13:20:13 crc kubenswrapper[4970]: I1128 13:20:13.993628 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/ddf11f1e-5631-4329-9db4-b75fed094c5f-multus-cni-dir\") pod \"multus-krtxh\" (UID: \"ddf11f1e-5631-4329-9db4-b75fed094c5f\") " pod="openshift-multus/multus-krtxh" Nov 28 13:20:13 crc kubenswrapper[4970]: I1128 13:20:13.994033 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/ddf11f1e-5631-4329-9db4-b75fed094c5f-multus-daemon-config\") pod \"multus-krtxh\" (UID: \"ddf11f1e-5631-4329-9db4-b75fed094c5f\") " pod="openshift-multus/multus-krtxh" Nov 28 13:20:13 crc kubenswrapper[4970]: I1128 13:20:13.994036 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/c3f8e57d-69cb-4704-a35b-ca570d60797e-cni-binary-copy\") pod \"multus-additional-cni-plugins-2956x\" (UID: \"c3f8e57d-69cb-4704-a35b-ca570d60797e\") " pod="openshift-multus/multus-additional-cni-plugins-2956x" Nov 28 13:20:13 crc kubenswrapper[4970]: I1128 13:20:13.994053 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/ddf11f1e-5631-4329-9db4-b75fed094c5f-multus-cni-dir\") pod \"multus-krtxh\" (UID: \"ddf11f1e-5631-4329-9db4-b75fed094c5f\") " pod="openshift-multus/multus-krtxh" Nov 28 13:20:13 crc kubenswrapper[4970]: I1128 13:20:13.994079 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/ddf11f1e-5631-4329-9db4-b75fed094c5f-host-run-k8s-cni-cncf-io\") pod \"multus-krtxh\" (UID: \"ddf11f1e-5631-4329-9db4-b75fed094c5f\") " pod="openshift-multus/multus-krtxh" Nov 28 13:20:13 crc kubenswrapper[4970]: I1128 13:20:13.994098 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/17474edc-f114-4ee6-b6bb-95b55f1731ac-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-6c6s9\" (UID: \"17474edc-f114-4ee6-b6bb-95b55f1731ac\") " pod="openshift-ovn-kubernetes/ovnkube-node-6c6s9" Nov 28 13:20:13 crc kubenswrapper[4970]: I1128 13:20:13.994117 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/17474edc-f114-4ee6-b6bb-95b55f1731ac-run-ovn\") pod \"ovnkube-node-6c6s9\" (UID: \"17474edc-f114-4ee6-b6bb-95b55f1731ac\") " pod="openshift-ovn-kubernetes/ovnkube-node-6c6s9" Nov 28 13:20:13 crc kubenswrapper[4970]: I1128 13:20:13.994129 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/ddf11f1e-5631-4329-9db4-b75fed094c5f-host-run-k8s-cni-cncf-io\") pod \"multus-krtxh\" (UID: \"ddf11f1e-5631-4329-9db4-b75fed094c5f\") " pod="openshift-multus/multus-krtxh" Nov 28 13:20:13 crc kubenswrapper[4970]: I1128 13:20:13.994131 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/17474edc-f114-4ee6-b6bb-95b55f1731ac-log-socket\") pod \"ovnkube-node-6c6s9\" (UID: \"17474edc-f114-4ee6-b6bb-95b55f1731ac\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-6c6s9" Nov 28 13:20:13 crc kubenswrapper[4970]: I1128 13:20:13.994170 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/17474edc-f114-4ee6-b6bb-95b55f1731ac-host-cni-netd\") pod \"ovnkube-node-6c6s9\" (UID: \"17474edc-f114-4ee6-b6bb-95b55f1731ac\") " pod="openshift-ovn-kubernetes/ovnkube-node-6c6s9" Nov 28 13:20:13 crc kubenswrapper[4970]: I1128 13:20:13.994189 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/ddf11f1e-5631-4329-9db4-b75fed094c5f-multus-conf-dir\") pod \"multus-krtxh\" (UID: \"ddf11f1e-5631-4329-9db4-b75fed094c5f\") " pod="openshift-multus/multus-krtxh" Nov 28 13:20:13 crc kubenswrapper[4970]: I1128 13:20:13.994203 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/17474edc-f114-4ee6-b6bb-95b55f1731ac-node-log\") pod \"ovnkube-node-6c6s9\" (UID: \"17474edc-f114-4ee6-b6bb-95b55f1731ac\") " pod="openshift-ovn-kubernetes/ovnkube-node-6c6s9" Nov 28 13:20:13 crc kubenswrapper[4970]: I1128 13:20:13.994246 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/70bedd43-c527-436e-b47b-0b9ec5b10601-mcd-auth-proxy-config\") pod \"machine-config-daemon-tjrng\" (UID: \"70bedd43-c527-436e-b47b-0b9ec5b10601\") " pod="openshift-machine-config-operator/machine-config-daemon-tjrng" Nov 28 13:20:13 crc kubenswrapper[4970]: I1128 13:20:13.994247 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/ddf11f1e-5631-4329-9db4-b75fed094c5f-multus-conf-dir\") pod \"multus-krtxh\" (UID: \"ddf11f1e-5631-4329-9db4-b75fed094c5f\") " pod="openshift-multus/multus-krtxh" Nov 28 13:20:13 crc kubenswrapper[4970]: I1128 13:20:13.994267 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/c3f8e57d-69cb-4704-a35b-ca570d60797e-system-cni-dir\") pod \"multus-additional-cni-plugins-2956x\" (UID: \"c3f8e57d-69cb-4704-a35b-ca570d60797e\") " pod="openshift-multus/multus-additional-cni-plugins-2956x" Nov 28 13:20:13 crc kubenswrapper[4970]: I1128 13:20:13.994694 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/70bedd43-c527-436e-b47b-0b9ec5b10601-mcd-auth-proxy-config\") pod \"machine-config-daemon-tjrng\" (UID: \"70bedd43-c527-436e-b47b-0b9ec5b10601\") " pod="openshift-machine-config-operator/machine-config-daemon-tjrng" Nov 28 13:20:13 crc kubenswrapper[4970]: I1128 13:20:13.994728 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/ddf11f1e-5631-4329-9db4-b75fed094c5f-cni-binary-copy\") pod \"multus-krtxh\" (UID: \"ddf11f1e-5631-4329-9db4-b75fed094c5f\") " pod="openshift-multus/multus-krtxh" Nov 28 13:20:13 crc kubenswrapper[4970]: I1128 13:20:13.994745 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/17474edc-f114-4ee6-b6bb-95b55f1731ac-ovnkube-script-lib\") pod \"ovnkube-node-6c6s9\" (UID: \"17474edc-f114-4ee6-b6bb-95b55f1731ac\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-6c6s9" Nov 28 13:20:13 crc kubenswrapper[4970]: I1128 13:20:13.995108 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/ddf11f1e-5631-4329-9db4-b75fed094c5f-cni-binary-copy\") pod \"multus-krtxh\" (UID: \"ddf11f1e-5631-4329-9db4-b75fed094c5f\") " pod="openshift-multus/multus-krtxh" Nov 28 13:20:13 crc kubenswrapper[4970]: I1128 13:20:13.995138 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n7jgw\" (UniqueName: \"kubernetes.io/projected/ddf11f1e-5631-4329-9db4-b75fed094c5f-kube-api-access-n7jgw\") pod \"multus-krtxh\" (UID: \"ddf11f1e-5631-4329-9db4-b75fed094c5f\") " pod="openshift-multus/multus-krtxh" Nov 28 13:20:13 crc kubenswrapper[4970]: I1128 13:20:13.995169 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/70bedd43-c527-436e-b47b-0b9ec5b10601-rootfs\") pod \"machine-config-daemon-tjrng\" (UID: \"70bedd43-c527-436e-b47b-0b9ec5b10601\") " pod="openshift-machine-config-operator/machine-config-daemon-tjrng" Nov 28 13:20:13 crc kubenswrapper[4970]: I1128 13:20:13.995185 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/ddf11f1e-5631-4329-9db4-b75fed094c5f-host-var-lib-kubelet\") pod \"multus-krtxh\" (UID: \"ddf11f1e-5631-4329-9db4-b75fed094c5f\") " pod="openshift-multus/multus-krtxh" Nov 28 13:20:13 crc kubenswrapper[4970]: I1128 13:20:13.995201 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/ddf11f1e-5631-4329-9db4-b75fed094c5f-cnibin\") pod \"multus-krtxh\" (UID: \"ddf11f1e-5631-4329-9db4-b75fed094c5f\") " pod="openshift-multus/multus-krtxh" Nov 28 13:20:13 crc kubenswrapper[4970]: I1128 13:20:13.995230 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/ddf11f1e-5631-4329-9db4-b75fed094c5f-host-var-lib-cni-bin\") pod \"multus-krtxh\" (UID: \"ddf11f1e-5631-4329-9db4-b75fed094c5f\") " pod="openshift-multus/multus-krtxh" Nov 28 13:20:13 crc kubenswrapper[4970]: I1128 13:20:13.995245 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/c3f8e57d-69cb-4704-a35b-ca570d60797e-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-2956x\" (UID: \"c3f8e57d-69cb-4704-a35b-ca570d60797e\") " pod="openshift-multus/multus-additional-cni-plugins-2956x" Nov 28 13:20:13 crc kubenswrapper[4970]: I1128 13:20:13.995258 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/ddf11f1e-5631-4329-9db4-b75fed094c5f-multus-socket-dir-parent\") pod \"multus-krtxh\" (UID: \"ddf11f1e-5631-4329-9db4-b75fed094c5f\") " pod="openshift-multus/multus-krtxh" Nov 28 13:20:13 crc kubenswrapper[4970]: I1128 13:20:13.995274 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/17474edc-f114-4ee6-b6bb-95b55f1731ac-host-slash\") pod \"ovnkube-node-6c6s9\" (UID: \"17474edc-f114-4ee6-b6bb-95b55f1731ac\") " pod="openshift-ovn-kubernetes/ovnkube-node-6c6s9" Nov 28 13:20:13 crc kubenswrapper[4970]: I1128 13:20:13.995330 4970 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/c3f8e57d-69cb-4704-a35b-ca570d60797e-system-cni-dir\") pod \"multus-additional-cni-plugins-2956x\" (UID: \"c3f8e57d-69cb-4704-a35b-ca570d60797e\") " pod="openshift-multus/multus-additional-cni-plugins-2956x" Nov 28 13:20:13 crc kubenswrapper[4970]: I1128 13:20:13.995487 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/70bedd43-c527-436e-b47b-0b9ec5b10601-rootfs\") pod \"machine-config-daemon-tjrng\" (UID: \"70bedd43-c527-436e-b47b-0b9ec5b10601\") " pod="openshift-machine-config-operator/machine-config-daemon-tjrng" Nov 28 13:20:13 crc kubenswrapper[4970]: I1128 13:20:13.995512 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/ddf11f1e-5631-4329-9db4-b75fed094c5f-host-var-lib-kubelet\") pod \"multus-krtxh\" (UID: \"ddf11f1e-5631-4329-9db4-b75fed094c5f\") " pod="openshift-multus/multus-krtxh" Nov 28 13:20:13 crc kubenswrapper[4970]: I1128 13:20:13.995543 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/ddf11f1e-5631-4329-9db4-b75fed094c5f-cnibin\") pod \"multus-krtxh\" (UID: \"ddf11f1e-5631-4329-9db4-b75fed094c5f\") " pod="openshift-multus/multus-krtxh" Nov 28 13:20:13 crc kubenswrapper[4970]: I1128 13:20:13.995563 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/ddf11f1e-5631-4329-9db4-b75fed094c5f-host-var-lib-cni-bin\") pod \"multus-krtxh\" (UID: \"ddf11f1e-5631-4329-9db4-b75fed094c5f\") " pod="openshift-multus/multus-krtxh" Nov 28 13:20:13 crc kubenswrapper[4970]: I1128 13:20:13.995823 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/ddf11f1e-5631-4329-9db4-b75fed094c5f-multus-socket-dir-parent\") pod \"multus-krtxh\" (UID: \"ddf11f1e-5631-4329-9db4-b75fed094c5f\") " pod="openshift-multus/multus-krtxh" Nov 28 13:20:13 crc kubenswrapper[4970]: I1128 13:20:13.995936 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/c3f8e57d-69cb-4704-a35b-ca570d60797e-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-2956x\" (UID: \"c3f8e57d-69cb-4704-a35b-ca570d60797e\") " pod="openshift-multus/multus-additional-cni-plugins-2956x" Nov 28 13:20:13 crc kubenswrapper[4970]: I1128 13:20:13.996287 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/70bedd43-c527-436e-b47b-0b9ec5b10601-proxy-tls\") pod \"machine-config-daemon-tjrng\" (UID: \"70bedd43-c527-436e-b47b-0b9ec5b10601\") " pod="openshift-machine-config-operator/machine-config-daemon-tjrng" Nov 28 13:20:14 crc kubenswrapper[4970]: I1128 13:20:14.011793 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n7jgw\" (UniqueName: \"kubernetes.io/projected/ddf11f1e-5631-4329-9db4-b75fed094c5f-kube-api-access-n7jgw\") pod \"multus-krtxh\" (UID: \"ddf11f1e-5631-4329-9db4-b75fed094c5f\") " pod="openshift-multus/multus-krtxh" Nov 28 13:20:14 crc kubenswrapper[4970]: I1128 13:20:14.016947 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tr469\" (UniqueName: 
\"kubernetes.io/projected/70bedd43-c527-436e-b47b-0b9ec5b10601-kube-api-access-tr469\") pod \"machine-config-daemon-tjrng\" (UID: \"70bedd43-c527-436e-b47b-0b9ec5b10601\") " pod="openshift-machine-config-operator/machine-config-daemon-tjrng" Nov 28 13:20:14 crc kubenswrapper[4970]: I1128 13:20:14.018732 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-599b7\" (UniqueName: \"kubernetes.io/projected/c3f8e57d-69cb-4704-a35b-ca570d60797e-kube-api-access-599b7\") pod \"multus-additional-cni-plugins-2956x\" (UID: \"c3f8e57d-69cb-4704-a35b-ca570d60797e\") " pod="openshift-multus/multus-additional-cni-plugins-2956x" Nov 28 13:20:14 crc kubenswrapper[4970]: I1128 13:20:14.029408 4970 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/network-metrics-daemon-4vr87"] Nov 28 13:20:14 crc kubenswrapper[4970]: I1128 13:20:14.029784 4970 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4vr87" Nov 28 13:20:14 crc kubenswrapper[4970]: E1128 13:20:14.029850 4970 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4vr87" podUID="c76cfdfa-c0f1-4f41-8f5b-83161f7f8bf0" Nov 28 13:20:14 crc kubenswrapper[4970]: I1128 13:20:14.050512 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/c3f8e57d-69cb-4704-a35b-ca570d60797e-tuning-conf-dir\") pod \"multus-additional-cni-plugins-2956x\" (UID: \"c3f8e57d-69cb-4704-a35b-ca570d60797e\") " pod="openshift-multus/multus-additional-cni-plugins-2956x" Nov 28 13:20:14 crc kubenswrapper[4970]: I1128 13:20:14.096134 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/17474edc-f114-4ee6-b6bb-95b55f1731ac-host-slash\") pod \"ovnkube-node-6c6s9\" (UID: \"17474edc-f114-4ee6-b6bb-95b55f1731ac\") " pod="openshift-ovn-kubernetes/ovnkube-node-6c6s9" Nov 28 13:20:14 crc kubenswrapper[4970]: I1128 13:20:14.096185 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/17474edc-f114-4ee6-b6bb-95b55f1731ac-host-run-ovn-kubernetes\") pod \"ovnkube-node-6c6s9\" (UID: \"17474edc-f114-4ee6-b6bb-95b55f1731ac\") " pod="openshift-ovn-kubernetes/ovnkube-node-6c6s9" Nov 28 13:20:14 crc kubenswrapper[4970]: I1128 13:20:14.096204 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/17474edc-f114-4ee6-b6bb-95b55f1731ac-run-systemd\") pod \"ovnkube-node-6c6s9\" (UID: \"17474edc-f114-4ee6-b6bb-95b55f1731ac\") " pod="openshift-ovn-kubernetes/ovnkube-node-6c6s9" Nov 28 13:20:14 crc kubenswrapper[4970]: I1128 13:20:14.096257 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/17474edc-f114-4ee6-b6bb-95b55f1731ac-var-lib-openvswitch\") pod \"ovnkube-node-6c6s9\" (UID: \"17474edc-f114-4ee6-b6bb-95b55f1731ac\") " pod="openshift-ovn-kubernetes/ovnkube-node-6c6s9" Nov 28 13:20:14 crc kubenswrapper[4970]: I1128 13:20:14.096277 4970 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/17474edc-f114-4ee6-b6bb-95b55f1731ac-ovnkube-config\") pod \"ovnkube-node-6c6s9\" (UID: \"17474edc-f114-4ee6-b6bb-95b55f1731ac\") " pod="openshift-ovn-kubernetes/ovnkube-node-6c6s9" Nov 28 13:20:14 crc kubenswrapper[4970]: I1128 13:20:14.096295 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/17474edc-f114-4ee6-b6bb-95b55f1731ac-systemd-units\") pod \"ovnkube-node-6c6s9\" (UID: \"17474edc-f114-4ee6-b6bb-95b55f1731ac\") " pod="openshift-ovn-kubernetes/ovnkube-node-6c6s9" Nov 28 13:20:14 crc kubenswrapper[4970]: I1128 13:20:14.096308 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/17474edc-f114-4ee6-b6bb-95b55f1731ac-etc-openvswitch\") pod \"ovnkube-node-6c6s9\" (UID: \"17474edc-f114-4ee6-b6bb-95b55f1731ac\") " pod="openshift-ovn-kubernetes/ovnkube-node-6c6s9" Nov 28 13:20:14 crc kubenswrapper[4970]: I1128 13:20:14.096314 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/17474edc-f114-4ee6-b6bb-95b55f1731ac-host-slash\") pod \"ovnkube-node-6c6s9\" (UID: \"17474edc-f114-4ee6-b6bb-95b55f1731ac\") " pod="openshift-ovn-kubernetes/ovnkube-node-6c6s9" Nov 28 13:20:14 crc kubenswrapper[4970]: I1128 13:20:14.096356 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/17474edc-f114-4ee6-b6bb-95b55f1731ac-run-systemd\") pod \"ovnkube-node-6c6s9\" (UID: \"17474edc-f114-4ee6-b6bb-95b55f1731ac\") " pod="openshift-ovn-kubernetes/ovnkube-node-6c6s9" Nov 28 13:20:14 crc kubenswrapper[4970]: I1128 13:20:14.096324 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/17474edc-f114-4ee6-b6bb-95b55f1731ac-host-run-ovn-kubernetes\") pod \"ovnkube-node-6c6s9\" (UID: \"17474edc-f114-4ee6-b6bb-95b55f1731ac\") " pod="openshift-ovn-kubernetes/ovnkube-node-6c6s9" Nov 28 13:20:14 crc kubenswrapper[4970]: I1128 13:20:14.096359 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/17474edc-f114-4ee6-b6bb-95b55f1731ac-etc-openvswitch\") pod \"ovnkube-node-6c6s9\" (UID: \"17474edc-f114-4ee6-b6bb-95b55f1731ac\") " pod="openshift-ovn-kubernetes/ovnkube-node-6c6s9" Nov 28 13:20:14 crc kubenswrapper[4970]: I1128 13:20:14.096345 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/17474edc-f114-4ee6-b6bb-95b55f1731ac-var-lib-openvswitch\") pod \"ovnkube-node-6c6s9\" (UID: \"17474edc-f114-4ee6-b6bb-95b55f1731ac\") " pod="openshift-ovn-kubernetes/ovnkube-node-6c6s9" Nov 28 13:20:14 crc kubenswrapper[4970]: I1128 13:20:14.096454 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/17474edc-f114-4ee6-b6bb-95b55f1731ac-host-cni-bin\") pod \"ovnkube-node-6c6s9\" (UID: \"17474edc-f114-4ee6-b6bb-95b55f1731ac\") " pod="openshift-ovn-kubernetes/ovnkube-node-6c6s9" Nov 28 13:20:14 crc kubenswrapper[4970]: I1128 13:20:14.096666 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: 
\"kubernetes.io/host-path/17474edc-f114-4ee6-b6bb-95b55f1731ac-systemd-units\") pod \"ovnkube-node-6c6s9\" (UID: \"17474edc-f114-4ee6-b6bb-95b55f1731ac\") " pod="openshift-ovn-kubernetes/ovnkube-node-6c6s9" Nov 28 13:20:14 crc kubenswrapper[4970]: I1128 13:20:14.096701 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/17474edc-f114-4ee6-b6bb-95b55f1731ac-host-cni-bin\") pod \"ovnkube-node-6c6s9\" (UID: \"17474edc-f114-4ee6-b6bb-95b55f1731ac\") " pod="openshift-ovn-kubernetes/ovnkube-node-6c6s9" Nov 28 13:20:14 crc kubenswrapper[4970]: I1128 13:20:14.096733 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/17474edc-f114-4ee6-b6bb-95b55f1731ac-env-overrides\") pod \"ovnkube-node-6c6s9\" (UID: \"17474edc-f114-4ee6-b6bb-95b55f1731ac\") " pod="openshift-ovn-kubernetes/ovnkube-node-6c6s9" Nov 28 13:20:14 crc kubenswrapper[4970]: I1128 13:20:14.096752 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/17474edc-f114-4ee6-b6bb-95b55f1731ac-host-kubelet\") pod \"ovnkube-node-6c6s9\" (UID: \"17474edc-f114-4ee6-b6bb-95b55f1731ac\") " pod="openshift-ovn-kubernetes/ovnkube-node-6c6s9" Nov 28 13:20:14 crc kubenswrapper[4970]: I1128 13:20:14.096768 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/17474edc-f114-4ee6-b6bb-95b55f1731ac-ovn-node-metrics-cert\") pod \"ovnkube-node-6c6s9\" (UID: \"17474edc-f114-4ee6-b6bb-95b55f1731ac\") " pod="openshift-ovn-kubernetes/ovnkube-node-6c6s9" Nov 28 13:20:14 crc kubenswrapper[4970]: I1128 13:20:14.097305 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sx9x6\" (UniqueName: \"kubernetes.io/projected/c76cfdfa-c0f1-4f41-8f5b-83161f7f8bf0-kube-api-access-sx9x6\") pod \"network-metrics-daemon-4vr87\" (UID: \"c76cfdfa-c0f1-4f41-8f5b-83161f7f8bf0\") " pod="openshift-multus/network-metrics-daemon-4vr87" Nov 28 13:20:14 crc kubenswrapper[4970]: I1128 13:20:14.097373 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 28 13:20:14 crc kubenswrapper[4970]: I1128 13:20:14.097412 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gsjj6\" (UniqueName: \"kubernetes.io/projected/17474edc-f114-4ee6-b6bb-95b55f1731ac-kube-api-access-gsjj6\") pod \"ovnkube-node-6c6s9\" (UID: \"17474edc-f114-4ee6-b6bb-95b55f1731ac\") " pod="openshift-ovn-kubernetes/ovnkube-node-6c6s9" Nov 28 13:20:14 crc kubenswrapper[4970]: I1128 13:20:14.097482 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/17474edc-f114-4ee6-b6bb-95b55f1731ac-run-openvswitch\") pod \"ovnkube-node-6c6s9\" (UID: \"17474edc-f114-4ee6-b6bb-95b55f1731ac\") " pod="openshift-ovn-kubernetes/ovnkube-node-6c6s9" Nov 28 13:20:14 crc kubenswrapper[4970]: I1128 13:20:14.097496 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" 
(UniqueName: \"kubernetes.io/configmap/17474edc-f114-4ee6-b6bb-95b55f1731ac-env-overrides\") pod \"ovnkube-node-6c6s9\" (UID: \"17474edc-f114-4ee6-b6bb-95b55f1731ac\") " pod="openshift-ovn-kubernetes/ovnkube-node-6c6s9" Nov 28 13:20:14 crc kubenswrapper[4970]: I1128 13:20:14.097497 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/17474edc-f114-4ee6-b6bb-95b55f1731ac-ovnkube-config\") pod \"ovnkube-node-6c6s9\" (UID: \"17474edc-f114-4ee6-b6bb-95b55f1731ac\") " pod="openshift-ovn-kubernetes/ovnkube-node-6c6s9" Nov 28 13:20:14 crc kubenswrapper[4970]: E1128 13:20:14.097619 4970 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Nov 28 13:20:14 crc kubenswrapper[4970]: I1128 13:20:14.097635 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/17474edc-f114-4ee6-b6bb-95b55f1731ac-run-openvswitch\") pod \"ovnkube-node-6c6s9\" (UID: \"17474edc-f114-4ee6-b6bb-95b55f1731ac\") " pod="openshift-ovn-kubernetes/ovnkube-node-6c6s9" Nov 28 13:20:14 crc kubenswrapper[4970]: E1128 13:20:14.097656 4970 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Nov 28 13:20:14 crc kubenswrapper[4970]: E1128 13:20:14.097672 4970 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 28 13:20:14 crc kubenswrapper[4970]: I1128 13:20:14.097691 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/17474edc-f114-4ee6-b6bb-95b55f1731ac-host-run-netns\") pod \"ovnkube-node-6c6s9\" (UID: \"17474edc-f114-4ee6-b6bb-95b55f1731ac\") " pod="openshift-ovn-kubernetes/ovnkube-node-6c6s9" Nov 28 13:20:14 crc kubenswrapper[4970]: E1128 13:20:14.097910 4970 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-11-28 13:20:18.097893218 +0000 UTC m=+28.950775018 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 28 13:20:14 crc kubenswrapper[4970]: I1128 13:20:14.097667 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/17474edc-f114-4ee6-b6bb-95b55f1731ac-host-run-netns\") pod \"ovnkube-node-6c6s9\" (UID: \"17474edc-f114-4ee6-b6bb-95b55f1731ac\") " pod="openshift-ovn-kubernetes/ovnkube-node-6c6s9" Nov 28 13:20:14 crc kubenswrapper[4970]: I1128 13:20:14.097993 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/17474edc-f114-4ee6-b6bb-95b55f1731ac-run-ovn\") pod \"ovnkube-node-6c6s9\" (UID: \"17474edc-f114-4ee6-b6bb-95b55f1731ac\") " pod="openshift-ovn-kubernetes/ovnkube-node-6c6s9" Nov 28 13:20:14 crc kubenswrapper[4970]: I1128 13:20:14.098032 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/17474edc-f114-4ee6-b6bb-95b55f1731ac-log-socket\") pod \"ovnkube-node-6c6s9\" (UID: \"17474edc-f114-4ee6-b6bb-95b55f1731ac\") " pod="openshift-ovn-kubernetes/ovnkube-node-6c6s9" Nov 28 13:20:14 crc kubenswrapper[4970]: I1128 13:20:14.098060 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/17474edc-f114-4ee6-b6bb-95b55f1731ac-host-cni-netd\") pod \"ovnkube-node-6c6s9\" (UID: \"17474edc-f114-4ee6-b6bb-95b55f1731ac\") " pod="openshift-ovn-kubernetes/ovnkube-node-6c6s9" Nov 28 13:20:14 crc kubenswrapper[4970]: I1128 13:20:14.098089 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/17474edc-f114-4ee6-b6bb-95b55f1731ac-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-6c6s9\" (UID: \"17474edc-f114-4ee6-b6bb-95b55f1731ac\") " pod="openshift-ovn-kubernetes/ovnkube-node-6c6s9" Nov 28 13:20:14 crc kubenswrapper[4970]: I1128 13:20:14.098115 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c76cfdfa-c0f1-4f41-8f5b-83161f7f8bf0-metrics-certs\") pod \"network-metrics-daemon-4vr87\" (UID: \"c76cfdfa-c0f1-4f41-8f5b-83161f7f8bf0\") " pod="openshift-multus/network-metrics-daemon-4vr87" Nov 28 13:20:14 crc kubenswrapper[4970]: I1128 13:20:14.098146 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/17474edc-f114-4ee6-b6bb-95b55f1731ac-node-log\") pod \"ovnkube-node-6c6s9\" (UID: \"17474edc-f114-4ee6-b6bb-95b55f1731ac\") " pod="openshift-ovn-kubernetes/ovnkube-node-6c6s9" Nov 28 13:20:14 crc kubenswrapper[4970]: I1128 13:20:14.098177 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/17474edc-f114-4ee6-b6bb-95b55f1731ac-ovnkube-script-lib\") pod \"ovnkube-node-6c6s9\" (UID: \"17474edc-f114-4ee6-b6bb-95b55f1731ac\") " pod="openshift-ovn-kubernetes/ovnkube-node-6c6s9" Nov 28 13:20:14 
crc kubenswrapper[4970]: I1128 13:20:14.098237 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 28 13:20:14 crc kubenswrapper[4970]: I1128 13:20:14.098274 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 28 13:20:14 crc kubenswrapper[4970]: I1128 13:20:14.098311 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 28 13:20:14 crc kubenswrapper[4970]: I1128 13:20:14.098588 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/17474edc-f114-4ee6-b6bb-95b55f1731ac-run-ovn\") pod \"ovnkube-node-6c6s9\" (UID: \"17474edc-f114-4ee6-b6bb-95b55f1731ac\") " pod="openshift-ovn-kubernetes/ovnkube-node-6c6s9" Nov 28 13:20:14 crc kubenswrapper[4970]: E1128 13:20:14.098612 4970 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Nov 28 13:20:14 crc kubenswrapper[4970]: E1128 13:20:14.098631 4970 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Nov 28 13:20:14 crc kubenswrapper[4970]: I1128 13:20:14.098647 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/17474edc-f114-4ee6-b6bb-95b55f1731ac-log-socket\") pod \"ovnkube-node-6c6s9\" (UID: \"17474edc-f114-4ee6-b6bb-95b55f1731ac\") " pod="openshift-ovn-kubernetes/ovnkube-node-6c6s9" Nov 28 13:20:14 crc kubenswrapper[4970]: E1128 13:20:14.098652 4970 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 28 13:20:14 crc kubenswrapper[4970]: I1128 13:20:14.098677 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/17474edc-f114-4ee6-b6bb-95b55f1731ac-node-log\") pod \"ovnkube-node-6c6s9\" (UID: \"17474edc-f114-4ee6-b6bb-95b55f1731ac\") " pod="openshift-ovn-kubernetes/ovnkube-node-6c6s9" Nov 28 13:20:14 crc kubenswrapper[4970]: E1128 13:20:14.098718 4970 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. 
No retries permitted until 2025-11-28 13:20:18.098697974 +0000 UTC m=+28.951579774 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 28 13:20:14 crc kubenswrapper[4970]: I1128 13:20:14.098758 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/17474edc-f114-4ee6-b6bb-95b55f1731ac-host-cni-netd\") pod \"ovnkube-node-6c6s9\" (UID: \"17474edc-f114-4ee6-b6bb-95b55f1731ac\") " pod="openshift-ovn-kubernetes/ovnkube-node-6c6s9" Nov 28 13:20:14 crc kubenswrapper[4970]: I1128 13:20:14.098801 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/17474edc-f114-4ee6-b6bb-95b55f1731ac-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-6c6s9\" (UID: \"17474edc-f114-4ee6-b6bb-95b55f1731ac\") " pod="openshift-ovn-kubernetes/ovnkube-node-6c6s9" Nov 28 13:20:14 crc kubenswrapper[4970]: E1128 13:20:14.098874 4970 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Nov 28 13:20:14 crc kubenswrapper[4970]: E1128 13:20:14.098917 4970 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-11-28 13:20:18.0989018 +0000 UTC m=+28.951783600 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Nov 28 13:20:14 crc kubenswrapper[4970]: E1128 13:20:14.098986 4970 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Nov 28 13:20:14 crc kubenswrapper[4970]: E1128 13:20:14.099028 4970 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-11-28 13:20:18.099017904 +0000 UTC m=+28.951899704 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Nov 28 13:20:14 crc kubenswrapper[4970]: I1128 13:20:14.099369 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/17474edc-f114-4ee6-b6bb-95b55f1731ac-ovnkube-script-lib\") pod \"ovnkube-node-6c6s9\" (UID: \"17474edc-f114-4ee6-b6bb-95b55f1731ac\") " pod="openshift-ovn-kubernetes/ovnkube-node-6c6s9" Nov 28 13:20:14 crc kubenswrapper[4970]: I1128 13:20:14.100837 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/17474edc-f114-4ee6-b6bb-95b55f1731ac-host-kubelet\") pod \"ovnkube-node-6c6s9\" (UID: \"17474edc-f114-4ee6-b6bb-95b55f1731ac\") " pod="openshift-ovn-kubernetes/ovnkube-node-6c6s9" Nov 28 13:20:14 crc kubenswrapper[4970]: I1128 13:20:14.103355 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/17474edc-f114-4ee6-b6bb-95b55f1731ac-ovn-node-metrics-cert\") pod \"ovnkube-node-6c6s9\" (UID: \"17474edc-f114-4ee6-b6bb-95b55f1731ac\") " pod="openshift-ovn-kubernetes/ovnkube-node-6c6s9" Nov 28 13:20:14 crc kubenswrapper[4970]: I1128 13:20:14.118672 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gsjj6\" (UniqueName: \"kubernetes.io/projected/17474edc-f114-4ee6-b6bb-95b55f1731ac-kube-api-access-gsjj6\") pod \"ovnkube-node-6c6s9\" (UID: \"17474edc-f114-4ee6-b6bb-95b55f1731ac\") " pod="openshift-ovn-kubernetes/ovnkube-node-6c6s9" Nov 28 13:20:14 crc kubenswrapper[4970]: I1128 13:20:14.157884 4970 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-tjrng" Nov 28 13:20:14 crc kubenswrapper[4970]: W1128 13:20:14.167514 4970 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod70bedd43_c527_436e_b47b_0b9ec5b10601.slice/crio-866c3f37c8b8878651457cd821e62342d296c454d5f88d37b414791fc1161b59 WatchSource:0}: Error finding container 866c3f37c8b8878651457cd821e62342d296c454d5f88d37b414791fc1161b59: Status 404 returned error can't find the container with id 866c3f37c8b8878651457cd821e62342d296c454d5f88d37b414791fc1161b59 Nov 28 13:20:14 crc kubenswrapper[4970]: I1128 13:20:14.187308 4970 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-krtxh" Nov 28 13:20:14 crc kubenswrapper[4970]: I1128 13:20:14.197241 4970 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-2956x" Nov 28 13:20:14 crc kubenswrapper[4970]: I1128 13:20:14.199409 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sx9x6\" (UniqueName: \"kubernetes.io/projected/c76cfdfa-c0f1-4f41-8f5b-83161f7f8bf0-kube-api-access-sx9x6\") pod \"network-metrics-daemon-4vr87\" (UID: \"c76cfdfa-c0f1-4f41-8f5b-83161f7f8bf0\") " pod="openshift-multus/network-metrics-daemon-4vr87" Nov 28 13:20:14 crc kubenswrapper[4970]: I1128 13:20:14.199463 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c76cfdfa-c0f1-4f41-8f5b-83161f7f8bf0-metrics-certs\") pod \"network-metrics-daemon-4vr87\" (UID: \"c76cfdfa-c0f1-4f41-8f5b-83161f7f8bf0\") " pod="openshift-multus/network-metrics-daemon-4vr87" Nov 28 13:20:14 crc kubenswrapper[4970]: E1128 13:20:14.199579 4970 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Nov 28 13:20:14 crc kubenswrapper[4970]: E1128 13:20:14.199643 4970 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c76cfdfa-c0f1-4f41-8f5b-83161f7f8bf0-metrics-certs podName:c76cfdfa-c0f1-4f41-8f5b-83161f7f8bf0 nodeName:}" failed. No retries permitted until 2025-11-28 13:20:14.699631167 +0000 UTC m=+25.552512967 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/c76cfdfa-c0f1-4f41-8f5b-83161f7f8bf0-metrics-certs") pod "network-metrics-daemon-4vr87" (UID: "c76cfdfa-c0f1-4f41-8f5b-83161f7f8bf0") : object "openshift-multus"/"metrics-daemon-secret" not registered Nov 28 13:20:14 crc kubenswrapper[4970]: I1128 13:20:14.203533 4970 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-6c6s9" Nov 28 13:20:14 crc kubenswrapper[4970]: I1128 13:20:14.226170 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sx9x6\" (UniqueName: \"kubernetes.io/projected/c76cfdfa-c0f1-4f41-8f5b-83161f7f8bf0-kube-api-access-sx9x6\") pod \"network-metrics-daemon-4vr87\" (UID: \"c76cfdfa-c0f1-4f41-8f5b-83161f7f8bf0\") " pod="openshift-multus/network-metrics-daemon-4vr87" Nov 28 13:20:14 crc kubenswrapper[4970]: W1128 13:20:14.227769 4970 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podddf11f1e_5631_4329_9db4_b75fed094c5f.slice/crio-18243ec220fb12958df47556f91ce35c2e3b07394daafa3675b1a10b2ad606b0 WatchSource:0}: Error finding container 18243ec220fb12958df47556f91ce35c2e3b07394daafa3675b1a10b2ad606b0: Status 404 returned error can't find the container with id 18243ec220fb12958df47556f91ce35c2e3b07394daafa3675b1a10b2ad606b0 Nov 28 13:20:14 crc kubenswrapper[4970]: W1128 13:20:14.228090 4970 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc3f8e57d_69cb_4704_a35b_ca570d60797e.slice/crio-d91aec52c76207759a2c9da61b61cd0d00e5a2ee9df5b201ae0b4a126c6722c3 WatchSource:0}: Error finding container d91aec52c76207759a2c9da61b61cd0d00e5a2ee9df5b201ae0b4a126c6722c3: Status 404 returned error can't find the container with id d91aec52c76207759a2c9da61b61cd0d00e5a2ee9df5b201ae0b4a126c6722c3 Nov 28 13:20:14 crc kubenswrapper[4970]: W1128 13:20:14.231403 4970 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod17474edc_f114_4ee6_b6bb_95b55f1731ac.slice/crio-014e0aa873d6af808f944184cc135d7a3b56ce1c0ee28f1ffc24d53951b18e4a WatchSource:0}: Error finding container 014e0aa873d6af808f944184cc135d7a3b56ce1c0ee28f1ffc24d53951b18e4a: Status 404 returned error can't find the container with id 014e0aa873d6af808f944184cc135d7a3b56ce1c0ee28f1ffc24d53951b18e4a Nov 28 13:20:14 crc kubenswrapper[4970]: I1128 13:20:14.320719 4970 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rcvf5"] Nov 28 13:20:14 crc kubenswrapper[4970]: I1128 13:20:14.321522 4970 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rcvf5" Nov 28 13:20:14 crc kubenswrapper[4970]: I1128 13:20:14.324361 4970 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Nov 28 13:20:14 crc kubenswrapper[4970]: I1128 13:20:14.324436 4970 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Nov 28 13:20:14 crc kubenswrapper[4970]: I1128 13:20:14.380548 4970 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 28 13:20:14 crc kubenswrapper[4970]: I1128 13:20:14.380577 4970 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 28 13:20:14 crc kubenswrapper[4970]: I1128 13:20:14.380554 4970 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 28 13:20:14 crc kubenswrapper[4970]: E1128 13:20:14.380692 4970 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 28 13:20:14 crc kubenswrapper[4970]: E1128 13:20:14.380770 4970 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 28 13:20:14 crc kubenswrapper[4970]: E1128 13:20:14.380840 4970 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 28 13:20:14 crc kubenswrapper[4970]: I1128 13:20:14.401739 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/88f78dc3-1d71-4054-a138-f1fd29efca8c-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-rcvf5\" (UID: \"88f78dc3-1d71-4054-a138-f1fd29efca8c\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rcvf5" Nov 28 13:20:14 crc kubenswrapper[4970]: I1128 13:20:14.401794 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9jhw6\" (UniqueName: \"kubernetes.io/projected/88f78dc3-1d71-4054-a138-f1fd29efca8c-kube-api-access-9jhw6\") pod \"ovnkube-control-plane-749d76644c-rcvf5\" (UID: \"88f78dc3-1d71-4054-a138-f1fd29efca8c\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rcvf5" Nov 28 13:20:14 crc kubenswrapper[4970]: I1128 13:20:14.401837 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/88f78dc3-1d71-4054-a138-f1fd29efca8c-env-overrides\") pod \"ovnkube-control-plane-749d76644c-rcvf5\" (UID: \"88f78dc3-1d71-4054-a138-f1fd29efca8c\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rcvf5" Nov 28 13:20:14 crc kubenswrapper[4970]: I1128 13:20:14.401872 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/88f78dc3-1d71-4054-a138-f1fd29efca8c-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-rcvf5\" (UID: \"88f78dc3-1d71-4054-a138-f1fd29efca8c\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rcvf5" Nov 28 13:20:14 crc kubenswrapper[4970]: I1128 13:20:14.502746 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: 
\"kubernetes.io/secret/88f78dc3-1d71-4054-a138-f1fd29efca8c-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-rcvf5\" (UID: \"88f78dc3-1d71-4054-a138-f1fd29efca8c\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rcvf5" Nov 28 13:20:14 crc kubenswrapper[4970]: I1128 13:20:14.502785 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9jhw6\" (UniqueName: \"kubernetes.io/projected/88f78dc3-1d71-4054-a138-f1fd29efca8c-kube-api-access-9jhw6\") pod \"ovnkube-control-plane-749d76644c-rcvf5\" (UID: \"88f78dc3-1d71-4054-a138-f1fd29efca8c\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rcvf5" Nov 28 13:20:14 crc kubenswrapper[4970]: I1128 13:20:14.502802 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/88f78dc3-1d71-4054-a138-f1fd29efca8c-env-overrides\") pod \"ovnkube-control-plane-749d76644c-rcvf5\" (UID: \"88f78dc3-1d71-4054-a138-f1fd29efca8c\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rcvf5" Nov 28 13:20:14 crc kubenswrapper[4970]: I1128 13:20:14.502839 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/88f78dc3-1d71-4054-a138-f1fd29efca8c-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-rcvf5\" (UID: \"88f78dc3-1d71-4054-a138-f1fd29efca8c\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rcvf5" Nov 28 13:20:14 crc kubenswrapper[4970]: I1128 13:20:14.503436 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/88f78dc3-1d71-4054-a138-f1fd29efca8c-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-rcvf5\" (UID: \"88f78dc3-1d71-4054-a138-f1fd29efca8c\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rcvf5" Nov 28 13:20:14 crc kubenswrapper[4970]: I1128 13:20:14.504485 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/88f78dc3-1d71-4054-a138-f1fd29efca8c-env-overrides\") pod \"ovnkube-control-plane-749d76644c-rcvf5\" (UID: \"88f78dc3-1d71-4054-a138-f1fd29efca8c\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rcvf5" Nov 28 13:20:14 crc kubenswrapper[4970]: I1128 13:20:14.506896 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/88f78dc3-1d71-4054-a138-f1fd29efca8c-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-rcvf5\" (UID: \"88f78dc3-1d71-4054-a138-f1fd29efca8c\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rcvf5" Nov 28 13:20:14 crc kubenswrapper[4970]: I1128 13:20:14.541153 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-v8ddv" event={"ID":"40a1e63a-aa83-4b08-aea7-9e03807f66a5","Type":"ContainerStarted","Data":"4f63081ec30d2c1206a321f8eb7141a0f3ebdf96ce893468ec116bedce3f5ff0"} Nov 28 13:20:14 crc kubenswrapper[4970]: I1128 13:20:14.541196 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-v8ddv" event={"ID":"40a1e63a-aa83-4b08-aea7-9e03807f66a5","Type":"ContainerStarted","Data":"ab7a31ce46096e325d6431cd2f4c899bd7928fdb8eddcc6ee74e022fde5c1173"} Nov 28 13:20:14 crc kubenswrapper[4970]: I1128 13:20:14.542897 4970 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-dns/node-resolver-vmzw2" event={"ID":"7a706317-e9d0-4a0e-be0a-f85da6e77cbd","Type":"ContainerStarted","Data":"33f38ed6ee1966fd91ccb50de77f53558006e989342f4db7fa5c157e5b310b45"} Nov 28 13:20:14 crc kubenswrapper[4970]: I1128 13:20:14.542923 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-vmzw2" event={"ID":"7a706317-e9d0-4a0e-be0a-f85da6e77cbd","Type":"ContainerStarted","Data":"bd5716329de5c1c463112f414afc083912b3a57cd6b1a7fe7f56c0a973f6ab24"} Nov 28 13:20:14 crc kubenswrapper[4970]: I1128 13:20:14.544443 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-krtxh" event={"ID":"ddf11f1e-5631-4329-9db4-b75fed094c5f","Type":"ContainerStarted","Data":"24a7c4d8c833fe666f119406934eac445c54366b38731e68e743d0c8fd524617"} Nov 28 13:20:14 crc kubenswrapper[4970]: I1128 13:20:14.544471 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-krtxh" event={"ID":"ddf11f1e-5631-4329-9db4-b75fed094c5f","Type":"ContainerStarted","Data":"18243ec220fb12958df47556f91ce35c2e3b07394daafa3675b1a10b2ad606b0"} Nov 28 13:20:14 crc kubenswrapper[4970]: I1128 13:20:14.546114 4970 generic.go:334] "Generic (PLEG): container finished" podID="17474edc-f114-4ee6-b6bb-95b55f1731ac" containerID="af322842df0e9f9249a0d4c5342059686d51e57a9259e9d7f99d3ab656639a63" exitCode=0 Nov 28 13:20:14 crc kubenswrapper[4970]: I1128 13:20:14.546151 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6c6s9" event={"ID":"17474edc-f114-4ee6-b6bb-95b55f1731ac","Type":"ContainerDied","Data":"af322842df0e9f9249a0d4c5342059686d51e57a9259e9d7f99d3ab656639a63"} Nov 28 13:20:14 crc kubenswrapper[4970]: I1128 13:20:14.546167 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6c6s9" event={"ID":"17474edc-f114-4ee6-b6bb-95b55f1731ac","Type":"ContainerStarted","Data":"014e0aa873d6af808f944184cc135d7a3b56ce1c0ee28f1ffc24d53951b18e4a"} Nov 28 13:20:14 crc kubenswrapper[4970]: I1128 13:20:14.548937 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-tjrng" event={"ID":"70bedd43-c527-436e-b47b-0b9ec5b10601","Type":"ContainerStarted","Data":"d1b7a16ad3b7d14af1b3edceab8910247a5f7ec6cc43bd9ebff6c9a95cf8b592"} Nov 28 13:20:14 crc kubenswrapper[4970]: I1128 13:20:14.548967 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-tjrng" event={"ID":"70bedd43-c527-436e-b47b-0b9ec5b10601","Type":"ContainerStarted","Data":"fc9b6fc184f5dc3ba36a264ad6b3b87d8306222016e8b9eab63d75530062a2bd"} Nov 28 13:20:14 crc kubenswrapper[4970]: I1128 13:20:14.548977 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-tjrng" event={"ID":"70bedd43-c527-436e-b47b-0b9ec5b10601","Type":"ContainerStarted","Data":"866c3f37c8b8878651457cd821e62342d296c454d5f88d37b414791fc1161b59"} Nov 28 13:20:14 crc kubenswrapper[4970]: I1128 13:20:14.553320 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-2956x" event={"ID":"c3f8e57d-69cb-4704-a35b-ca570d60797e","Type":"ContainerStarted","Data":"3aeaa1f693e224e980ebd08f054ade0e29abd16d99c426c6d5e339c9804881c6"} Nov 28 13:20:14 crc kubenswrapper[4970]: I1128 13:20:14.553374 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-2956x" 
event={"ID":"c3f8e57d-69cb-4704-a35b-ca570d60797e","Type":"ContainerStarted","Data":"d91aec52c76207759a2c9da61b61cd0d00e5a2ee9df5b201ae0b4a126c6722c3"} Nov 28 13:20:14 crc kubenswrapper[4970]: I1128 13:20:14.555756 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9jhw6\" (UniqueName: \"kubernetes.io/projected/88f78dc3-1d71-4054-a138-f1fd29efca8c-kube-api-access-9jhw6\") pod \"ovnkube-control-plane-749d76644c-rcvf5\" (UID: \"88f78dc3-1d71-4054-a138-f1fd29efca8c\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rcvf5" Nov 28 13:20:14 crc kubenswrapper[4970]: I1128 13:20:14.563409 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"5fe44501fd99fc77647b76cf69c4061ed3d237b4f1b683cfe6c710a01602c8f2"} Nov 28 13:20:14 crc kubenswrapper[4970]: I1128 13:20:14.571160 4970 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-v8ddv" podStartSLOduration=1.571143376 podStartE2EDuration="1.571143376s" podCreationTimestamp="2025-11-28 13:20:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-28 13:20:14.570045161 +0000 UTC m=+25.422926961" watchObservedRunningTime="2025-11-28 13:20:14.571143376 +0000 UTC m=+25.424025176" Nov 28 13:20:14 crc kubenswrapper[4970]: I1128 13:20:14.596052 4970 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-krtxh" podStartSLOduration=1.596029986 podStartE2EDuration="1.596029986s" podCreationTimestamp="2025-11-28 13:20:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-28 13:20:14.595277031 +0000 UTC m=+25.448158831" watchObservedRunningTime="2025-11-28 13:20:14.596029986 +0000 UTC m=+25.448911786" Nov 28 13:20:14 crc kubenswrapper[4970]: I1128 13:20:14.631746 4970 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-vmzw2" podStartSLOduration=1.631728243 podStartE2EDuration="1.631728243s" podCreationTimestamp="2025-11-28 13:20:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-28 13:20:14.631553947 +0000 UTC m=+25.484435757" watchObservedRunningTime="2025-11-28 13:20:14.631728243 +0000 UTC m=+25.484610043" Nov 28 13:20:14 crc kubenswrapper[4970]: I1128 13:20:14.675655 4970 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rcvf5" Nov 28 13:20:14 crc kubenswrapper[4970]: I1128 13:20:14.680921 4970 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-daemon-tjrng" podStartSLOduration=1.680907092 podStartE2EDuration="1.680907092s" podCreationTimestamp="2025-11-28 13:20:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-28 13:20:14.680044504 +0000 UTC m=+25.532926314" watchObservedRunningTime="2025-11-28 13:20:14.680907092 +0000 UTC m=+25.533788892" Nov 28 13:20:14 crc kubenswrapper[4970]: W1128 13:20:14.689991 4970 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod88f78dc3_1d71_4054_a138_f1fd29efca8c.slice/crio-65933bd9778d89e2320cbf72880ce31798d102b34f792d1d4c80449efb27d0dc WatchSource:0}: Error finding container 65933bd9778d89e2320cbf72880ce31798d102b34f792d1d4c80449efb27d0dc: Status 404 returned error can't find the container with id 65933bd9778d89e2320cbf72880ce31798d102b34f792d1d4c80449efb27d0dc Nov 28 13:20:14 crc kubenswrapper[4970]: I1128 13:20:14.704058 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c76cfdfa-c0f1-4f41-8f5b-83161f7f8bf0-metrics-certs\") pod \"network-metrics-daemon-4vr87\" (UID: \"c76cfdfa-c0f1-4f41-8f5b-83161f7f8bf0\") " pod="openshift-multus/network-metrics-daemon-4vr87" Nov 28 13:20:14 crc kubenswrapper[4970]: E1128 13:20:14.704300 4970 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Nov 28 13:20:14 crc kubenswrapper[4970]: E1128 13:20:14.704395 4970 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c76cfdfa-c0f1-4f41-8f5b-83161f7f8bf0-metrics-certs podName:c76cfdfa-c0f1-4f41-8f5b-83161f7f8bf0 nodeName:}" failed. No retries permitted until 2025-11-28 13:20:15.704374796 +0000 UTC m=+26.557256596 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/c76cfdfa-c0f1-4f41-8f5b-83161f7f8bf0-metrics-certs") pod "network-metrics-daemon-4vr87" (UID: "c76cfdfa-c0f1-4f41-8f5b-83161f7f8bf0") : object "openshift-multus"/"metrics-daemon-secret" not registered Nov 28 13:20:14 crc kubenswrapper[4970]: I1128 13:20:14.967651 4970 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 28 13:20:14 crc kubenswrapper[4970]: I1128 13:20:14.969997 4970 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:20:14 crc kubenswrapper[4970]: I1128 13:20:14.970040 4970 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:20:14 crc kubenswrapper[4970]: I1128 13:20:14.970052 4970 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:20:14 crc kubenswrapper[4970]: I1128 13:20:14.970157 4970 kubelet_node_status.go:76] "Attempting to register node" node="crc" Nov 28 13:20:14 crc kubenswrapper[4970]: I1128 13:20:14.977813 4970 kubelet_node_status.go:115] "Node was previously registered" node="crc" Nov 28 13:20:14 crc kubenswrapper[4970]: I1128 13:20:14.978045 4970 kubelet_node_status.go:79] "Successfully registered node" node="crc" Nov 28 13:20:14 crc kubenswrapper[4970]: I1128 13:20:14.979046 4970 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 28 13:20:14 crc kubenswrapper[4970]: I1128 13:20:14.979079 4970 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 28 13:20:14 crc kubenswrapper[4970]: I1128 13:20:14.979091 4970 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 28 13:20:14 crc kubenswrapper[4970]: I1128 13:20:14.979107 4970 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 28 13:20:14 crc kubenswrapper[4970]: I1128 13:20:14.979118 4970 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-28T13:20:14Z","lastTransitionTime":"2025-11-28T13:20:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 28 13:20:15 crc kubenswrapper[4970]: I1128 13:20:15.024571 4970 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-version/cluster-version-operator-5c965bbfc6-2x2lr"] Nov 28 13:20:15 crc kubenswrapper[4970]: I1128 13:20:15.024946 4970 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-2x2lr" Nov 28 13:20:15 crc kubenswrapper[4970]: I1128 13:20:15.026748 4970 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Nov 28 13:20:15 crc kubenswrapper[4970]: I1128 13:20:15.026792 4970 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Nov 28 13:20:15 crc kubenswrapper[4970]: I1128 13:20:15.026918 4970 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Nov 28 13:20:15 crc kubenswrapper[4970]: I1128 13:20:15.027309 4970 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Nov 28 13:20:15 crc kubenswrapper[4970]: I1128 13:20:15.108206 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/874e6ffb-3a56-48e8-b9b8-536b73a90bf8-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-2x2lr\" (UID: \"874e6ffb-3a56-48e8-b9b8-536b73a90bf8\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-2x2lr" Nov 28 13:20:15 crc kubenswrapper[4970]: I1128 13:20:15.108248 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/874e6ffb-3a56-48e8-b9b8-536b73a90bf8-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-2x2lr\" (UID: \"874e6ffb-3a56-48e8-b9b8-536b73a90bf8\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-2x2lr" Nov 28 13:20:15 crc kubenswrapper[4970]: I1128 13:20:15.108282 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/874e6ffb-3a56-48e8-b9b8-536b73a90bf8-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-2x2lr\" (UID: \"874e6ffb-3a56-48e8-b9b8-536b73a90bf8\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-2x2lr" Nov 28 13:20:15 crc kubenswrapper[4970]: I1128 13:20:15.108308 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/874e6ffb-3a56-48e8-b9b8-536b73a90bf8-service-ca\") pod \"cluster-version-operator-5c965bbfc6-2x2lr\" (UID: \"874e6ffb-3a56-48e8-b9b8-536b73a90bf8\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-2x2lr" Nov 28 13:20:15 crc kubenswrapper[4970]: I1128 13:20:15.108325 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/874e6ffb-3a56-48e8-b9b8-536b73a90bf8-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-2x2lr\" (UID: \"874e6ffb-3a56-48e8-b9b8-536b73a90bf8\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-2x2lr" Nov 28 13:20:15 crc kubenswrapper[4970]: I1128 13:20:15.172817 4970 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Nov 28 13:20:15 crc kubenswrapper[4970]: I1128 13:20:15.176630 4970 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Nov 28 13:20:15 crc kubenswrapper[4970]: 
I1128 13:20:15.181892 4970 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-crc"] Nov 28 13:20:15 crc kubenswrapper[4970]: I1128 13:20:15.209379 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/874e6ffb-3a56-48e8-b9b8-536b73a90bf8-service-ca\") pod \"cluster-version-operator-5c965bbfc6-2x2lr\" (UID: \"874e6ffb-3a56-48e8-b9b8-536b73a90bf8\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-2x2lr" Nov 28 13:20:15 crc kubenswrapper[4970]: I1128 13:20:15.209424 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/874e6ffb-3a56-48e8-b9b8-536b73a90bf8-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-2x2lr\" (UID: \"874e6ffb-3a56-48e8-b9b8-536b73a90bf8\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-2x2lr" Nov 28 13:20:15 crc kubenswrapper[4970]: I1128 13:20:15.209487 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/874e6ffb-3a56-48e8-b9b8-536b73a90bf8-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-2x2lr\" (UID: \"874e6ffb-3a56-48e8-b9b8-536b73a90bf8\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-2x2lr" Nov 28 13:20:15 crc kubenswrapper[4970]: I1128 13:20:15.209512 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/874e6ffb-3a56-48e8-b9b8-536b73a90bf8-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-2x2lr\" (UID: \"874e6ffb-3a56-48e8-b9b8-536b73a90bf8\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-2x2lr" Nov 28 13:20:15 crc kubenswrapper[4970]: I1128 13:20:15.209554 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/874e6ffb-3a56-48e8-b9b8-536b73a90bf8-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-2x2lr\" (UID: \"874e6ffb-3a56-48e8-b9b8-536b73a90bf8\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-2x2lr" Nov 28 13:20:15 crc kubenswrapper[4970]: I1128 13:20:15.209687 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/874e6ffb-3a56-48e8-b9b8-536b73a90bf8-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-2x2lr\" (UID: \"874e6ffb-3a56-48e8-b9b8-536b73a90bf8\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-2x2lr" Nov 28 13:20:15 crc kubenswrapper[4970]: I1128 13:20:15.209714 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/874e6ffb-3a56-48e8-b9b8-536b73a90bf8-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-2x2lr\" (UID: \"874e6ffb-3a56-48e8-b9b8-536b73a90bf8\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-2x2lr" Nov 28 13:20:15 crc kubenswrapper[4970]: I1128 13:20:15.210478 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/874e6ffb-3a56-48e8-b9b8-536b73a90bf8-service-ca\") pod \"cluster-version-operator-5c965bbfc6-2x2lr\" (UID: \"874e6ffb-3a56-48e8-b9b8-536b73a90bf8\") " 
pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-2x2lr" Nov 28 13:20:15 crc kubenswrapper[4970]: I1128 13:20:15.215861 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/874e6ffb-3a56-48e8-b9b8-536b73a90bf8-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-2x2lr\" (UID: \"874e6ffb-3a56-48e8-b9b8-536b73a90bf8\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-2x2lr" Nov 28 13:20:15 crc kubenswrapper[4970]: I1128 13:20:15.226393 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/874e6ffb-3a56-48e8-b9b8-536b73a90bf8-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-2x2lr\" (UID: \"874e6ffb-3a56-48e8-b9b8-536b73a90bf8\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-2x2lr" Nov 28 13:20:15 crc kubenswrapper[4970]: I1128 13:20:15.317119 4970 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-2x2lr" Nov 28 13:20:15 crc kubenswrapper[4970]: W1128 13:20:15.330260 4970 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod874e6ffb_3a56_48e8_b9b8_536b73a90bf8.slice/crio-1efbedafad3f78a12951f4b52078398edfb909e2a0030832e6c55352c632783e WatchSource:0}: Error finding container 1efbedafad3f78a12951f4b52078398edfb909e2a0030832e6c55352c632783e: Status 404 returned error can't find the container with id 1efbedafad3f78a12951f4b52078398edfb909e2a0030832e6c55352c632783e Nov 28 13:20:15 crc kubenswrapper[4970]: I1128 13:20:15.577927 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6c6s9" event={"ID":"17474edc-f114-4ee6-b6bb-95b55f1731ac","Type":"ContainerStarted","Data":"a7d93a763c00244c10708497ff36b83939050bf928342bd4ead223a08fda036d"} Nov 28 13:20:15 crc kubenswrapper[4970]: I1128 13:20:15.577989 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6c6s9" event={"ID":"17474edc-f114-4ee6-b6bb-95b55f1731ac","Type":"ContainerStarted","Data":"3dbb30bfa38ca50eb6b55645d501b246d1f9a16d6d4c6b40402ba6aac34f8e50"} Nov 28 13:20:15 crc kubenswrapper[4970]: I1128 13:20:15.578010 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6c6s9" event={"ID":"17474edc-f114-4ee6-b6bb-95b55f1731ac","Type":"ContainerStarted","Data":"f3080d04f3ff55e9ea3e5226bc08fed0e4937b1ebbab93b76c039066071aba2d"} Nov 28 13:20:15 crc kubenswrapper[4970]: I1128 13:20:15.579379 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rcvf5" event={"ID":"88f78dc3-1d71-4054-a138-f1fd29efca8c","Type":"ContainerStarted","Data":"65933bd9778d89e2320cbf72880ce31798d102b34f792d1d4c80449efb27d0dc"} Nov 28 13:20:15 crc kubenswrapper[4970]: I1128 13:20:15.580556 4970 generic.go:334] "Generic (PLEG): container finished" podID="c3f8e57d-69cb-4704-a35b-ca570d60797e" containerID="3aeaa1f693e224e980ebd08f054ade0e29abd16d99c426c6d5e339c9804881c6" exitCode=0 Nov 28 13:20:15 crc kubenswrapper[4970]: I1128 13:20:15.580608 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-2956x" event={"ID":"c3f8e57d-69cb-4704-a35b-ca570d60797e","Type":"ContainerDied","Data":"3aeaa1f693e224e980ebd08f054ade0e29abd16d99c426c6d5e339c9804881c6"} Nov 
28 13:20:15 crc kubenswrapper[4970]: I1128 13:20:15.583908 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-2x2lr" event={"ID":"874e6ffb-3a56-48e8-b9b8-536b73a90bf8","Type":"ContainerStarted","Data":"1efbedafad3f78a12951f4b52078398edfb909e2a0030832e6c55352c632783e"} Nov 28 13:20:15 crc kubenswrapper[4970]: E1128 13:20:15.588457 4970 kubelet.go:1929] "Failed creating a mirror pod for" err="pods \"kube-controller-manager-crc\" already exists" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Nov 28 13:20:15 crc kubenswrapper[4970]: I1128 13:20:15.715547 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c76cfdfa-c0f1-4f41-8f5b-83161f7f8bf0-metrics-certs\") pod \"network-metrics-daemon-4vr87\" (UID: \"c76cfdfa-c0f1-4f41-8f5b-83161f7f8bf0\") " pod="openshift-multus/network-metrics-daemon-4vr87" Nov 28 13:20:15 crc kubenswrapper[4970]: E1128 13:20:15.716805 4970 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Nov 28 13:20:15 crc kubenswrapper[4970]: E1128 13:20:15.716859 4970 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c76cfdfa-c0f1-4f41-8f5b-83161f7f8bf0-metrics-certs podName:c76cfdfa-c0f1-4f41-8f5b-83161f7f8bf0 nodeName:}" failed. No retries permitted until 2025-11-28 13:20:17.716843783 +0000 UTC m=+28.569725573 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/c76cfdfa-c0f1-4f41-8f5b-83161f7f8bf0-metrics-certs") pod "network-metrics-daemon-4vr87" (UID: "c76cfdfa-c0f1-4f41-8f5b-83161f7f8bf0") : object "openshift-multus"/"metrics-daemon-secret" not registered Nov 28 13:20:16 crc kubenswrapper[4970]: I1128 13:20:16.379838 4970 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 28 13:20:16 crc kubenswrapper[4970]: I1128 13:20:16.379897 4970 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4vr87" Nov 28 13:20:16 crc kubenswrapper[4970]: I1128 13:20:16.379931 4970 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 28 13:20:16 crc kubenswrapper[4970]: I1128 13:20:16.379902 4970 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 28 13:20:16 crc kubenswrapper[4970]: E1128 13:20:16.380005 4970 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 28 13:20:16 crc kubenswrapper[4970]: E1128 13:20:16.380167 4970 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 28 13:20:16 crc kubenswrapper[4970]: E1128 13:20:16.380238 4970 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4vr87" podUID="c76cfdfa-c0f1-4f41-8f5b-83161f7f8bf0" Nov 28 13:20:16 crc kubenswrapper[4970]: E1128 13:20:16.380394 4970 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 28 13:20:16 crc kubenswrapper[4970]: I1128 13:20:16.593333 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rcvf5" event={"ID":"88f78dc3-1d71-4054-a138-f1fd29efca8c","Type":"ContainerStarted","Data":"3533fbd05a0b183b3efec9a9ffa26949ebfc93f19e5e29fe9a5711bf9ae59305"} Nov 28 13:20:16 crc kubenswrapper[4970]: I1128 13:20:16.593379 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rcvf5" event={"ID":"88f78dc3-1d71-4054-a138-f1fd29efca8c","Type":"ContainerStarted","Data":"d461be5accb569967bd46bd8065a36f209e52048f5c2c25ad899cb91f28b5f29"} Nov 28 13:20:16 crc kubenswrapper[4970]: I1128 13:20:16.596921 4970 generic.go:334] "Generic (PLEG): container finished" podID="c3f8e57d-69cb-4704-a35b-ca570d60797e" containerID="60c5616f43d5d438b89600ed816dd54fb0f61e45253d1ff6303d78f8762b6e7d" exitCode=0 Nov 28 13:20:16 crc kubenswrapper[4970]: I1128 13:20:16.597016 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-2956x" event={"ID":"c3f8e57d-69cb-4704-a35b-ca570d60797e","Type":"ContainerDied","Data":"60c5616f43d5d438b89600ed816dd54fb0f61e45253d1ff6303d78f8762b6e7d"} Nov 28 13:20:16 crc kubenswrapper[4970]: I1128 13:20:16.599864 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-2x2lr" event={"ID":"874e6ffb-3a56-48e8-b9b8-536b73a90bf8","Type":"ContainerStarted","Data":"2a4e7f0a2a7b6cfc3ebaf541c1527c589ce6e7964767171ceb6073bf9c757358"} Nov 28 13:20:16 crc kubenswrapper[4970]: I1128 13:20:16.605819 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6c6s9" event={"ID":"17474edc-f114-4ee6-b6bb-95b55f1731ac","Type":"ContainerStarted","Data":"189ed85c89ee8535a047fcd0640a468aa0ea5ce2f787ec1e5c83b10b2c4636f8"} Nov 28 13:20:16 crc kubenswrapper[4970]: I1128 13:20:16.605857 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6c6s9" event={"ID":"17474edc-f114-4ee6-b6bb-95b55f1731ac","Type":"ContainerStarted","Data":"12d0179126488b68ea59ae5ac2c1144408426c93d8ee99a546fe23c0e78bc29d"} Nov 28 13:20:16 crc kubenswrapper[4970]: I1128 13:20:16.612086 4970 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podStartSLOduration=1.612072581 
podStartE2EDuration="1.612072581s" podCreationTimestamp="2025-11-28 13:20:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-28 13:20:15.628652248 +0000 UTC m=+26.481534128" watchObservedRunningTime="2025-11-28 13:20:16.612072581 +0000 UTC m=+27.464954391" Nov 28 13:20:16 crc kubenswrapper[4970]: I1128 13:20:16.612699 4970 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rcvf5" podStartSLOduration=2.612689301 podStartE2EDuration="2.612689301s" podCreationTimestamp="2025-11-28 13:20:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-28 13:20:16.611688309 +0000 UTC m=+27.464570119" watchObservedRunningTime="2025-11-28 13:20:16.612689301 +0000 UTC m=+27.465571111" Nov 28 13:20:16 crc kubenswrapper[4970]: I1128 13:20:16.629391 4970 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-2x2lr" podStartSLOduration=3.6293716270000003 podStartE2EDuration="3.629371627s" podCreationTimestamp="2025-11-28 13:20:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-28 13:20:16.628681145 +0000 UTC m=+27.481562955" watchObservedRunningTime="2025-11-28 13:20:16.629371627 +0000 UTC m=+27.482253437" Nov 28 13:20:17 crc kubenswrapper[4970]: I1128 13:20:17.613169 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6c6s9" event={"ID":"17474edc-f114-4ee6-b6bb-95b55f1731ac","Type":"ContainerStarted","Data":"50bbc6bccc812634f10fce491d51f0bfc11b450bdd9435d97b5723bc25815737"} Nov 28 13:20:17 crc kubenswrapper[4970]: I1128 13:20:17.615272 4970 generic.go:334] "Generic (PLEG): container finished" podID="c3f8e57d-69cb-4704-a35b-ca570d60797e" containerID="243705519e3a1531a3ea8207e19f188747b6b1a98bc16d361930621b4e7f9e39" exitCode=0 Nov 28 13:20:17 crc kubenswrapper[4970]: I1128 13:20:17.615406 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-2956x" event={"ID":"c3f8e57d-69cb-4704-a35b-ca570d60797e","Type":"ContainerDied","Data":"243705519e3a1531a3ea8207e19f188747b6b1a98bc16d361930621b4e7f9e39"} Nov 28 13:20:17 crc kubenswrapper[4970]: I1128 13:20:17.736496 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c76cfdfa-c0f1-4f41-8f5b-83161f7f8bf0-metrics-certs\") pod \"network-metrics-daemon-4vr87\" (UID: \"c76cfdfa-c0f1-4f41-8f5b-83161f7f8bf0\") " pod="openshift-multus/network-metrics-daemon-4vr87" Nov 28 13:20:17 crc kubenswrapper[4970]: E1128 13:20:17.737300 4970 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Nov 28 13:20:17 crc kubenswrapper[4970]: E1128 13:20:17.737393 4970 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c76cfdfa-c0f1-4f41-8f5b-83161f7f8bf0-metrics-certs podName:c76cfdfa-c0f1-4f41-8f5b-83161f7f8bf0 nodeName:}" failed. No retries permitted until 2025-11-28 13:20:21.737364443 +0000 UTC m=+32.590246333 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/c76cfdfa-c0f1-4f41-8f5b-83161f7f8bf0-metrics-certs") pod "network-metrics-daemon-4vr87" (UID: "c76cfdfa-c0f1-4f41-8f5b-83161f7f8bf0") : object "openshift-multus"/"metrics-daemon-secret" not registered Nov 28 13:20:18 crc kubenswrapper[4970]: I1128 13:20:18.038810 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 28 13:20:18 crc kubenswrapper[4970]: E1128 13:20:18.039067 4970 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-28 13:20:26.039027887 +0000 UTC m=+36.891909727 (durationBeforeRetry 8s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 28 13:20:18 crc kubenswrapper[4970]: I1128 13:20:18.140507 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 28 13:20:18 crc kubenswrapper[4970]: I1128 13:20:18.140555 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 28 13:20:18 crc kubenswrapper[4970]: I1128 13:20:18.140587 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 28 13:20:18 crc kubenswrapper[4970]: I1128 13:20:18.140621 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 28 13:20:18 crc kubenswrapper[4970]: E1128 13:20:18.140648 4970 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Nov 28 13:20:18 crc 
kubenswrapper[4970]: E1128 13:20:18.140720 4970 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Nov 28 13:20:18 crc kubenswrapper[4970]: E1128 13:20:18.140721 4970 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Nov 28 13:20:18 crc kubenswrapper[4970]: E1128 13:20:18.140756 4970 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-11-28 13:20:26.140727255 +0000 UTC m=+36.993609095 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Nov 28 13:20:18 crc kubenswrapper[4970]: E1128 13:20:18.140721 4970 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Nov 28 13:20:18 crc kubenswrapper[4970]: E1128 13:20:18.140816 4970 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-11-28 13:20:26.140792818 +0000 UTC m=+36.993674648 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Nov 28 13:20:18 crc kubenswrapper[4970]: E1128 13:20:18.140829 4970 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Nov 28 13:20:18 crc kubenswrapper[4970]: E1128 13:20:18.140845 4970 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 28 13:20:18 crc kubenswrapper[4970]: E1128 13:20:18.140738 4970 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Nov 28 13:20:18 crc kubenswrapper[4970]: E1128 13:20:18.140898 4970 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 28 13:20:18 crc kubenswrapper[4970]: E1128 13:20:18.140908 4970 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-11-28 13:20:26.140895841 +0000 UTC m=+36.993777761 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 28 13:20:18 crc kubenswrapper[4970]: E1128 13:20:18.140936 4970 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-11-28 13:20:26.140922482 +0000 UTC m=+36.993804372 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 28 13:20:18 crc kubenswrapper[4970]: I1128 13:20:18.380332 4970 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 28 13:20:18 crc kubenswrapper[4970]: E1128 13:20:18.380517 4970 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 28 13:20:18 crc kubenswrapper[4970]: I1128 13:20:18.380568 4970 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4vr87" Nov 28 13:20:18 crc kubenswrapper[4970]: I1128 13:20:18.380602 4970 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 28 13:20:18 crc kubenswrapper[4970]: I1128 13:20:18.380594 4970 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 28 13:20:18 crc kubenswrapper[4970]: E1128 13:20:18.380755 4970 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4vr87" podUID="c76cfdfa-c0f1-4f41-8f5b-83161f7f8bf0" Nov 28 13:20:18 crc kubenswrapper[4970]: E1128 13:20:18.380849 4970 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 28 13:20:18 crc kubenswrapper[4970]: E1128 13:20:18.380960 4970 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 28 13:20:18 crc kubenswrapper[4970]: I1128 13:20:18.623161 4970 generic.go:334] "Generic (PLEG): container finished" podID="c3f8e57d-69cb-4704-a35b-ca570d60797e" containerID="dba7f9432a0802e4bd93f8f15de22136ccc6d080ae00bb964bc82592593c9e53" exitCode=0 Nov 28 13:20:18 crc kubenswrapper[4970]: I1128 13:20:18.623203 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-2956x" event={"ID":"c3f8e57d-69cb-4704-a35b-ca570d60797e","Type":"ContainerDied","Data":"dba7f9432a0802e4bd93f8f15de22136ccc6d080ae00bb964bc82592593c9e53"} Nov 28 13:20:19 crc kubenswrapper[4970]: I1128 13:20:19.630991 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6c6s9" event={"ID":"17474edc-f114-4ee6-b6bb-95b55f1731ac","Type":"ContainerStarted","Data":"cdd0fc0e7dc5e274448e9288c6b82d7b14c900064fbf885342824db7ffa53248"} Nov 28 13:20:19 crc kubenswrapper[4970]: I1128 13:20:19.633561 4970 generic.go:334] "Generic (PLEG): container finished" podID="c3f8e57d-69cb-4704-a35b-ca570d60797e" containerID="4d8db95071c41e7e55134b495a117de0ade653103efb99b480b5dff658afce7b" exitCode=0 Nov 28 13:20:19 crc kubenswrapper[4970]: I1128 13:20:19.633612 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-2956x" event={"ID":"c3f8e57d-69cb-4704-a35b-ca570d60797e","Type":"ContainerDied","Data":"4d8db95071c41e7e55134b495a117de0ade653103efb99b480b5dff658afce7b"} Nov 28 13:20:20 crc kubenswrapper[4970]: I1128 13:20:20.380687 4970 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 28 13:20:20 crc kubenswrapper[4970]: I1128 13:20:20.380744 4970 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4vr87" Nov 28 13:20:20 crc kubenswrapper[4970]: E1128 13:20:20.381288 4970 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4vr87" podUID="c76cfdfa-c0f1-4f41-8f5b-83161f7f8bf0" Nov 28 13:20:20 crc kubenswrapper[4970]: I1128 13:20:20.380820 4970 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 28 13:20:20 crc kubenswrapper[4970]: I1128 13:20:20.380769 4970 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 28 13:20:20 crc kubenswrapper[4970]: E1128 13:20:20.381416 4970 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 28 13:20:20 crc kubenswrapper[4970]: E1128 13:20:20.381113 4970 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 28 13:20:20 crc kubenswrapper[4970]: E1128 13:20:20.381486 4970 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 28 13:20:20 crc kubenswrapper[4970]: I1128 13:20:20.644855 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-2956x" event={"ID":"c3f8e57d-69cb-4704-a35b-ca570d60797e","Type":"ContainerStarted","Data":"91bbeb9c78e4ece618f6b22c76716d66cb4e59043129664be811a60a9b7bd170"} Nov 28 13:20:21 crc kubenswrapper[4970]: I1128 13:20:21.655962 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6c6s9" event={"ID":"17474edc-f114-4ee6-b6bb-95b55f1731ac","Type":"ContainerStarted","Data":"1f1a97d1705ed1e76b0cf0f089eb6c47193e9abd9fb034819db37ed7fa020ecf"} Nov 28 13:20:21 crc kubenswrapper[4970]: I1128 13:20:21.656254 4970 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-6c6s9" Nov 28 13:20:21 crc kubenswrapper[4970]: I1128 13:20:21.664282 4970 generic.go:334] "Generic (PLEG): container finished" podID="c3f8e57d-69cb-4704-a35b-ca570d60797e" containerID="91bbeb9c78e4ece618f6b22c76716d66cb4e59043129664be811a60a9b7bd170" exitCode=0 Nov 28 13:20:21 crc kubenswrapper[4970]: I1128 13:20:21.664339 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-2956x" event={"ID":"c3f8e57d-69cb-4704-a35b-ca570d60797e","Type":"ContainerDied","Data":"91bbeb9c78e4ece618f6b22c76716d66cb4e59043129664be811a60a9b7bd170"} Nov 28 13:20:21 crc kubenswrapper[4970]: I1128 13:20:21.682804 4970 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-6c6s9" Nov 28 13:20:21 crc kubenswrapper[4970]: I1128 13:20:21.695197 4970 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-6c6s9" podStartSLOduration=8.695166928 podStartE2EDuration="8.695166928s" podCreationTimestamp="2025-11-28 13:20:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2025-11-28 13:20:21.694060423 +0000 UTC m=+32.546942293" watchObservedRunningTime="2025-11-28 13:20:21.695166928 +0000 UTC m=+32.548048778" Nov 28 13:20:21 crc kubenswrapper[4970]: I1128 13:20:21.789036 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c76cfdfa-c0f1-4f41-8f5b-83161f7f8bf0-metrics-certs\") pod \"network-metrics-daemon-4vr87\" (UID: \"c76cfdfa-c0f1-4f41-8f5b-83161f7f8bf0\") " pod="openshift-multus/network-metrics-daemon-4vr87" Nov 28 13:20:21 crc kubenswrapper[4970]: E1128 13:20:21.789833 4970 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Nov 28 13:20:21 crc kubenswrapper[4970]: E1128 13:20:21.789916 4970 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c76cfdfa-c0f1-4f41-8f5b-83161f7f8bf0-metrics-certs podName:c76cfdfa-c0f1-4f41-8f5b-83161f7f8bf0 nodeName:}" failed. No retries permitted until 2025-11-28 13:20:29.789897382 +0000 UTC m=+40.642779182 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/c76cfdfa-c0f1-4f41-8f5b-83161f7f8bf0-metrics-certs") pod "network-metrics-daemon-4vr87" (UID: "c76cfdfa-c0f1-4f41-8f5b-83161f7f8bf0") : object "openshift-multus"/"metrics-daemon-secret" not registered Nov 28 13:20:22 crc kubenswrapper[4970]: I1128 13:20:22.379857 4970 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 28 13:20:22 crc kubenswrapper[4970]: I1128 13:20:22.379883 4970 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 28 13:20:22 crc kubenswrapper[4970]: I1128 13:20:22.379941 4970 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4vr87" Nov 28 13:20:22 crc kubenswrapper[4970]: I1128 13:20:22.379990 4970 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 28 13:20:22 crc kubenswrapper[4970]: E1128 13:20:22.380153 4970 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 28 13:20:22 crc kubenswrapper[4970]: E1128 13:20:22.380757 4970 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 28 13:20:22 crc kubenswrapper[4970]: E1128 13:20:22.380868 4970 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4vr87" podUID="c76cfdfa-c0f1-4f41-8f5b-83161f7f8bf0" Nov 28 13:20:22 crc kubenswrapper[4970]: E1128 13:20:22.380955 4970 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 28 13:20:22 crc kubenswrapper[4970]: I1128 13:20:22.675114 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-2956x" event={"ID":"c3f8e57d-69cb-4704-a35b-ca570d60797e","Type":"ContainerStarted","Data":"b1cf13e2d99ac5ef18efabd880bb1a13c59f78152f861a7e70215fcf5544d239"} Nov 28 13:20:22 crc kubenswrapper[4970]: I1128 13:20:22.675204 4970 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-6c6s9" Nov 28 13:20:22 crc kubenswrapper[4970]: I1128 13:20:22.675264 4970 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-6c6s9" Nov 28 13:20:22 crc kubenswrapper[4970]: I1128 13:20:22.710755 4970 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-6c6s9" Nov 28 13:20:22 crc kubenswrapper[4970]: I1128 13:20:22.754578 4970 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-2956x" podStartSLOduration=9.754552861 podStartE2EDuration="9.754552861s" podCreationTimestamp="2025-11-28 13:20:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-28 13:20:22.70879064 +0000 UTC m=+33.561672500" watchObservedRunningTime="2025-11-28 13:20:22.754552861 +0000 UTC m=+33.607434691" Nov 28 13:20:22 crc kubenswrapper[4970]: I1128 13:20:22.843135 4970 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 28 13:20:23 crc kubenswrapper[4970]: I1128 13:20:23.334619 4970 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-4vr87"] Nov 28 13:20:23 crc kubenswrapper[4970]: I1128 13:20:23.335043 4970 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4vr87" Nov 28 13:20:23 crc kubenswrapper[4970]: E1128 13:20:23.335307 4970 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4vr87" podUID="c76cfdfa-c0f1-4f41-8f5b-83161f7f8bf0" Nov 28 13:20:24 crc kubenswrapper[4970]: I1128 13:20:24.379846 4970 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 28 13:20:24 crc kubenswrapper[4970]: I1128 13:20:24.379921 4970 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 28 13:20:24 crc kubenswrapper[4970]: E1128 13:20:24.380296 4970 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 28 13:20:24 crc kubenswrapper[4970]: E1128 13:20:24.380456 4970 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 28 13:20:24 crc kubenswrapper[4970]: I1128 13:20:24.380014 4970 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 28 13:20:24 crc kubenswrapper[4970]: E1128 13:20:24.380576 4970 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 28 13:20:25 crc kubenswrapper[4970]: I1128 13:20:25.380285 4970 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4vr87" Nov 28 13:20:25 crc kubenswrapper[4970]: E1128 13:20:25.380525 4970 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4vr87" podUID="c76cfdfa-c0f1-4f41-8f5b-83161f7f8bf0" Nov 28 13:20:25 crc kubenswrapper[4970]: I1128 13:20:25.856909 4970 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeReady" Nov 28 13:20:25 crc kubenswrapper[4970]: I1128 13:20:25.857499 4970 kubelet_node_status.go:538] "Fast updating node status as it just became ready" Nov 28 13:20:25 crc kubenswrapper[4970]: I1128 13:20:25.914806 4970 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-78pq5"] Nov 28 13:20:25 crc kubenswrapper[4970]: I1128 13:20:25.915778 4970 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-78pq5" Nov 28 13:20:25 crc kubenswrapper[4970]: I1128 13:20:25.918362 4970 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-wms6k"] Nov 28 13:20:25 crc kubenswrapper[4970]: I1128 13:20:25.919209 4970 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-wms6k" Nov 28 13:20:25 crc kubenswrapper[4970]: I1128 13:20:25.919308 4970 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-cbvqk"] Nov 28 13:20:25 crc kubenswrapper[4970]: I1128 13:20:25.920478 4970 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-cbvqk" Nov 28 13:20:25 crc kubenswrapper[4970]: I1128 13:20:25.923115 4970 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-6rjdm"] Nov 28 13:20:25 crc kubenswrapper[4970]: I1128 13:20:25.923872 4970 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-6rjdm" Nov 28 13:20:25 crc kubenswrapper[4970]: I1128 13:20:25.924516 4970 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-v5kwk"] Nov 28 13:20:25 crc kubenswrapper[4970]: I1128 13:20:25.925454 4970 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-v5kwk" Nov 28 13:20:25 crc kubenswrapper[4970]: I1128 13:20:25.929897 4970 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Nov 28 13:20:25 crc kubenswrapper[4970]: I1128 13:20:25.930164 4970 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Nov 28 13:20:25 crc kubenswrapper[4970]: I1128 13:20:25.930589 4970 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Nov 28 13:20:25 crc kubenswrapper[4970]: I1128 13:20:25.930701 4970 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Nov 28 13:20:25 crc kubenswrapper[4970]: I1128 13:20:25.931139 4970 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Nov 28 13:20:25 crc kubenswrapper[4970]: I1128 13:20:25.931254 4970 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Nov 28 13:20:25 crc kubenswrapper[4970]: I1128 13:20:25.931535 4970 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Nov 28 13:20:25 crc kubenswrapper[4970]: I1128 13:20:25.931894 4970 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Nov 28 13:20:25 crc kubenswrapper[4970]: I1128 13:20:25.932281 4970 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Nov 28 13:20:25 crc kubenswrapper[4970]: I1128 13:20:25.932614 4970 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Nov 28 13:20:25 crc kubenswrapper[4970]: I1128 13:20:25.933017 4970 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Nov 28 13:20:25 crc kubenswrapper[4970]: W1128 13:20:25.933063 4970 reflector.go:561] object-"openshift-controller-manager"/"client-ca": failed to list *v1.ConfigMap: configmaps "client-ca" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-controller-manager": no relationship found between 
node 'crc' and this object Nov 28 13:20:25 crc kubenswrapper[4970]: E1128 13:20:25.933119 4970 reflector.go:158] "Unhandled Error" err="object-\"openshift-controller-manager\"/\"client-ca\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"client-ca\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-controller-manager\": no relationship found between node 'crc' and this object" logger="UnhandledError" Nov 28 13:20:25 crc kubenswrapper[4970]: I1128 13:20:25.933398 4970 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Nov 28 13:20:25 crc kubenswrapper[4970]: I1128 13:20:25.933729 4970 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Nov 28 13:20:25 crc kubenswrapper[4970]: I1128 13:20:25.934047 4970 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Nov 28 13:20:25 crc kubenswrapper[4970]: W1128 13:20:25.934090 4970 reflector.go:561] object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c": failed to list *v1.Secret: secrets "openshift-controller-manager-sa-dockercfg-msq4c" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openshift-controller-manager": no relationship found between node 'crc' and this object Nov 28 13:20:25 crc kubenswrapper[4970]: E1128 13:20:25.934155 4970 reflector.go:158] "Unhandled Error" err="object-\"openshift-controller-manager\"/\"openshift-controller-manager-sa-dockercfg-msq4c\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"openshift-controller-manager-sa-dockercfg-msq4c\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-controller-manager\": no relationship found between node 'crc' and this object" logger="UnhandledError" Nov 28 13:20:25 crc kubenswrapper[4970]: I1128 13:20:25.934055 4970 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Nov 28 13:20:25 crc kubenswrapper[4970]: I1128 13:20:25.934468 4970 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Nov 28 13:20:25 crc kubenswrapper[4970]: W1128 13:20:25.934534 4970 reflector.go:561] object-"openshift-controller-manager"/"serving-cert": failed to list *v1.Secret: secrets "serving-cert" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openshift-controller-manager": no relationship found between node 'crc' and this object Nov 28 13:20:25 crc kubenswrapper[4970]: E1128 13:20:25.934873 4970 reflector.go:158] "Unhandled Error" err="object-\"openshift-controller-manager\"/\"serving-cert\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"serving-cert\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-controller-manager\": no relationship found between node 'crc' and this object" logger="UnhandledError" Nov 28 13:20:25 crc kubenswrapper[4970]: I1128 13:20:25.951202 4970 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Nov 28 13:20:25 crc kubenswrapper[4970]: I1128 13:20:25.951384 4970 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-machine-api"/"kube-root-ca.crt" Nov 28 13:20:25 crc kubenswrapper[4970]: I1128 13:20:25.951525 4970 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Nov 28 13:20:25 crc kubenswrapper[4970]: I1128 13:20:25.952002 4970 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Nov 28 13:20:25 crc kubenswrapper[4970]: I1128 13:20:25.952144 4970 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Nov 28 13:20:25 crc kubenswrapper[4970]: I1128 13:20:25.952340 4970 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Nov 28 13:20:25 crc kubenswrapper[4970]: I1128 13:20:25.952776 4970 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Nov 28 13:20:25 crc kubenswrapper[4970]: I1128 13:20:25.952928 4970 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Nov 28 13:20:25 crc kubenswrapper[4970]: I1128 13:20:25.953510 4970 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Nov 28 13:20:25 crc kubenswrapper[4970]: I1128 13:20:25.953673 4970 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Nov 28 13:20:25 crc kubenswrapper[4970]: I1128 13:20:25.953814 4970 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Nov 28 13:20:25 crc kubenswrapper[4970]: I1128 13:20:25.954023 4970 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-cdzrp"] Nov 28 13:20:25 crc kubenswrapper[4970]: I1128 13:20:25.954111 4970 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Nov 28 13:20:25 crc kubenswrapper[4970]: I1128 13:20:25.954207 4970 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Nov 28 13:20:25 crc kubenswrapper[4970]: I1128 13:20:25.954376 4970 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Nov 28 13:20:25 crc kubenswrapper[4970]: I1128 13:20:25.954384 4970 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Nov 28 13:20:25 crc kubenswrapper[4970]: I1128 13:20:25.954509 4970 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Nov 28 13:20:25 crc kubenswrapper[4970]: I1128 13:20:25.954571 4970 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-cdzrp" Nov 28 13:20:25 crc kubenswrapper[4970]: I1128 13:20:25.960902 4970 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Nov 28 13:20:25 crc kubenswrapper[4970]: I1128 13:20:25.964044 4970 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-7nxhr"] Nov 28 13:20:25 crc kubenswrapper[4970]: I1128 13:20:25.987584 4970 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-7954f5f757-9chbz"] Nov 28 13:20:25 crc kubenswrapper[4970]: I1128 13:20:25.987860 4970 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-9chbz" Nov 28 13:20:25 crc kubenswrapper[4970]: I1128 13:20:25.988139 4970 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-7nxhr" Nov 28 13:20:25 crc kubenswrapper[4970]: I1128 13:20:25.988491 4970 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-f9d7485db-kwnx5"] Nov 28 13:20:25 crc kubenswrapper[4970]: I1128 13:20:25.988927 4970 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-kwnx5" Nov 28 13:20:25 crc kubenswrapper[4970]: I1128 13:20:25.989589 4970 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-4s4w9"] Nov 28 13:20:25 crc kubenswrapper[4970]: I1128 13:20:25.990094 4970 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-4s4w9" Nov 28 13:20:25 crc kubenswrapper[4970]: I1128 13:20:25.990169 4970 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-xzv2b"] Nov 28 13:20:25 crc kubenswrapper[4970]: I1128 13:20:25.990717 4970 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-xzv2b" Nov 28 13:20:25 crc kubenswrapper[4970]: I1128 13:20:25.992167 4970 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Nov 28 13:20:25 crc kubenswrapper[4970]: I1128 13:20:25.993779 4970 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Nov 28 13:20:25 crc kubenswrapper[4970]: I1128 13:20:25.993823 4970 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:25.993870 4970 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:25.993914 4970 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:25.994683 4970 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:25.994724 4970 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:25.994753 4970 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:25.994862 4970 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:25.994898 4970 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:25.997018 4970 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-58897d9998-pzs29"] Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:25.994937 4970 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:25.995763 4970 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:25.997618 4970 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-pzs29" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:25.997966 4970 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-mdr5z"] Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:25.998430 4970 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:25.998457 4970 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-mdr5z" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:25.999704 4970 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-dhdzs"] Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.001908 4970 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.004952 4970 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.009470 4970 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-dhdzs" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.009827 4970 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-q9h56"] Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.018355 4970 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.032655 4970 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.034679 4970 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-5444994796-6tn57"] Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.036066 4970 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-86qzl"] Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.036622 4970 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-cq28p"] Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.037150 4970 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-rzrx9"] Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.037276 4970 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-6tn57" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.037343 4970 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-cq28p" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.037484 4970 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-86qzl" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.037739 4970 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-q9h56" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.052561 4970 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-rzrx9" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.052641 4970 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.054099 4970 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.054163 4970 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.054243 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.054331 4970 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-xqmqh"] Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.054367 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e80ce492-28d4-40cf-8a55-5a4f456e8255-config\") pod \"controller-manager-879f6c89f-cbvqk\" (UID: \"e80ce492-28d4-40cf-8a55-5a4f456e8255\") " pod="openshift-controller-manager/controller-manager-879f6c89f-cbvqk" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.054390 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/336437f6-aba2-46ae-bf5f-2555d2db13fb-etcd-serving-ca\") pod \"apiserver-76f77b778f-78pq5\" (UID: \"336437f6-aba2-46ae-bf5f-2555d2db13fb\") " pod="openshift-apiserver/apiserver-76f77b778f-78pq5" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.054406 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/336437f6-aba2-46ae-bf5f-2555d2db13fb-image-import-ca\") pod \"apiserver-76f77b778f-78pq5\" (UID: \"336437f6-aba2-46ae-bf5f-2555d2db13fb\") " pod="openshift-apiserver/apiserver-76f77b778f-78pq5" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.054422 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-njb4z\" (UniqueName: \"kubernetes.io/projected/3c65c694-b05d-40db-a754-9b530aadc7a7-kube-api-access-njb4z\") pod \"apiserver-7bbb656c7d-v5kwk\" (UID: \"3c65c694-b05d-40db-a754-9b530aadc7a7\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-v5kwk" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.054446 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/3c65c694-b05d-40db-a754-9b530aadc7a7-audit-dir\") pod \"apiserver-7bbb656c7d-v5kwk\" (UID: \"3c65c694-b05d-40db-a754-9b530aadc7a7\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-v5kwk" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.054462 4970 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Nov 28 13:20:26 crc kubenswrapper[4970]: E1128 
13:20:26.054533 4970 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-28 13:20:42.054519115 +0000 UTC m=+52.907400915 (durationBeforeRetry 16s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.054461 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/5998853c-3fbb-403e-b222-5a5c939dbb58-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-wms6k\" (UID: \"5998853c-3fbb-403e-b222-5a5c939dbb58\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-wms6k" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.054580 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vglkf\" (UniqueName: \"kubernetes.io/projected/5998853c-3fbb-403e-b222-5a5c939dbb58-kube-api-access-vglkf\") pod \"machine-api-operator-5694c8668f-wms6k\" (UID: \"5998853c-3fbb-403e-b222-5a5c939dbb58\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-wms6k" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.054597 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/65101460-48b8-4bd6-82b0-4f5bd4254ec5-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-cdzrp\" (UID: \"65101460-48b8-4bd6-82b0-4f5bd4254ec5\") " pod="openshift-authentication/oauth-openshift-558db77b4-cdzrp" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.054618 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/65101460-48b8-4bd6-82b0-4f5bd4254ec5-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-cdzrp\" (UID: \"65101460-48b8-4bd6-82b0-4f5bd4254ec5\") " pod="openshift-authentication/oauth-openshift-558db77b4-cdzrp" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.054633 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/336437f6-aba2-46ae-bf5f-2555d2db13fb-trusted-ca-bundle\") pod \"apiserver-76f77b778f-78pq5\" (UID: \"336437f6-aba2-46ae-bf5f-2555d2db13fb\") " pod="openshift-apiserver/apiserver-76f77b778f-78pq5" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.054648 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-45pqp\" (UniqueName: \"kubernetes.io/projected/e80ce492-28d4-40cf-8a55-5a4f456e8255-kube-api-access-45pqp\") pod \"controller-manager-879f6c89f-cbvqk\" (UID: \"e80ce492-28d4-40cf-8a55-5a4f456e8255\") " pod="openshift-controller-manager/controller-manager-879f6c89f-cbvqk" Nov 28 13:20:26 crc kubenswrapper[4970]: 
I1128 13:20:26.054666 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e1b7817f-3229-472a-b433-c2173e7abf6c-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-6rjdm\" (UID: \"e1b7817f-3229-472a-b433-c2173e7abf6c\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-6rjdm" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.054672 4970 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.054681 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e1b7817f-3229-472a-b433-c2173e7abf6c-config\") pod \"openshift-apiserver-operator-796bbdcf4f-6rjdm\" (UID: \"e1b7817f-3229-472a-b433-c2173e7abf6c\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-6rjdm" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.054696 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/3c65c694-b05d-40db-a754-9b530aadc7a7-encryption-config\") pod \"apiserver-7bbb656c7d-v5kwk\" (UID: \"3c65c694-b05d-40db-a754-9b530aadc7a7\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-v5kwk" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.054733 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/65101460-48b8-4bd6-82b0-4f5bd4254ec5-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-cdzrp\" (UID: \"65101460-48b8-4bd6-82b0-4f5bd4254ec5\") " pod="openshift-authentication/oauth-openshift-558db77b4-cdzrp" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.054754 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e80ce492-28d4-40cf-8a55-5a4f456e8255-client-ca\") pod \"controller-manager-879f6c89f-cbvqk\" (UID: \"e80ce492-28d4-40cf-8a55-5a4f456e8255\") " pod="openshift-controller-manager/controller-manager-879f6c89f-cbvqk" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.054772 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jwdc8\" (UniqueName: \"kubernetes.io/projected/336437f6-aba2-46ae-bf5f-2555d2db13fb-kube-api-access-jwdc8\") pod \"apiserver-76f77b778f-78pq5\" (UID: \"336437f6-aba2-46ae-bf5f-2555d2db13fb\") " pod="openshift-apiserver/apiserver-76f77b778f-78pq5" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.054791 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e80ce492-28d4-40cf-8a55-5a4f456e8255-serving-cert\") pod \"controller-manager-879f6c89f-cbvqk\" (UID: \"e80ce492-28d4-40cf-8a55-5a4f456e8255\") " pod="openshift-controller-manager/controller-manager-879f6c89f-cbvqk" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.054812 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3c65c694-b05d-40db-a754-9b530aadc7a7-serving-cert\") pod \"apiserver-7bbb656c7d-v5kwk\" (UID: \"3c65c694-b05d-40db-a754-9b530aadc7a7\") " 
pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-v5kwk" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.054828 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/65101460-48b8-4bd6-82b0-4f5bd4254ec5-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-cdzrp\" (UID: \"65101460-48b8-4bd6-82b0-4f5bd4254ec5\") " pod="openshift-authentication/oauth-openshift-558db77b4-cdzrp" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.054844 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e80ce492-28d4-40cf-8a55-5a4f456e8255-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-cbvqk\" (UID: \"e80ce492-28d4-40cf-8a55-5a4f456e8255\") " pod="openshift-controller-manager/controller-manager-879f6c89f-cbvqk" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.054863 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/65101460-48b8-4bd6-82b0-4f5bd4254ec5-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-cdzrp\" (UID: \"65101460-48b8-4bd6-82b0-4f5bd4254ec5\") " pod="openshift-authentication/oauth-openshift-558db77b4-cdzrp" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.054877 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/65101460-48b8-4bd6-82b0-4f5bd4254ec5-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-cdzrp\" (UID: \"65101460-48b8-4bd6-82b0-4f5bd4254ec5\") " pod="openshift-authentication/oauth-openshift-558db77b4-cdzrp" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.054902 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/336437f6-aba2-46ae-bf5f-2555d2db13fb-etcd-client\") pod \"apiserver-76f77b778f-78pq5\" (UID: \"336437f6-aba2-46ae-bf5f-2555d2db13fb\") " pod="openshift-apiserver/apiserver-76f77b778f-78pq5" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.054936 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/336437f6-aba2-46ae-bf5f-2555d2db13fb-config\") pod \"apiserver-76f77b778f-78pq5\" (UID: \"336437f6-aba2-46ae-bf5f-2555d2db13fb\") " pod="openshift-apiserver/apiserver-76f77b778f-78pq5" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.054947 4970 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-xqmqh" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.054956 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/3c65c694-b05d-40db-a754-9b530aadc7a7-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-v5kwk\" (UID: \"3c65c694-b05d-40db-a754-9b530aadc7a7\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-v5kwk" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.055247 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/5998853c-3fbb-403e-b222-5a5c939dbb58-images\") pod \"machine-api-operator-5694c8668f-wms6k\" (UID: \"5998853c-3fbb-403e-b222-5a5c939dbb58\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-wms6k" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.055279 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/65101460-48b8-4bd6-82b0-4f5bd4254ec5-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-cdzrp\" (UID: \"65101460-48b8-4bd6-82b0-4f5bd4254ec5\") " pod="openshift-authentication/oauth-openshift-558db77b4-cdzrp" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.055312 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/336437f6-aba2-46ae-bf5f-2555d2db13fb-serving-cert\") pod \"apiserver-76f77b778f-78pq5\" (UID: \"336437f6-aba2-46ae-bf5f-2555d2db13fb\") " pod="openshift-apiserver/apiserver-76f77b778f-78pq5" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.055338 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/65101460-48b8-4bd6-82b0-4f5bd4254ec5-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-cdzrp\" (UID: \"65101460-48b8-4bd6-82b0-4f5bd4254ec5\") " pod="openshift-authentication/oauth-openshift-558db77b4-cdzrp" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.055367 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/336437f6-aba2-46ae-bf5f-2555d2db13fb-audit\") pod \"apiserver-76f77b778f-78pq5\" (UID: \"336437f6-aba2-46ae-bf5f-2555d2db13fb\") " pod="openshift-apiserver/apiserver-76f77b778f-78pq5" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.055389 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7tmnn\" (UniqueName: \"kubernetes.io/projected/65101460-48b8-4bd6-82b0-4f5bd4254ec5-kube-api-access-7tmnn\") pod \"oauth-openshift-558db77b4-cdzrp\" (UID: \"65101460-48b8-4bd6-82b0-4f5bd4254ec5\") " pod="openshift-authentication/oauth-openshift-558db77b4-cdzrp" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.055421 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5998853c-3fbb-403e-b222-5a5c939dbb58-config\") pod \"machine-api-operator-5694c8668f-wms6k\" (UID: \"5998853c-3fbb-403e-b222-5a5c939dbb58\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-wms6k" Nov 28 13:20:26 crc 
kubenswrapper[4970]: I1128 13:20:26.055445 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/65101460-48b8-4bd6-82b0-4f5bd4254ec5-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-cdzrp\" (UID: \"65101460-48b8-4bd6-82b0-4f5bd4254ec5\") " pod="openshift-authentication/oauth-openshift-558db77b4-cdzrp" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.055466 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/65101460-48b8-4bd6-82b0-4f5bd4254ec5-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-cdzrp\" (UID: \"65101460-48b8-4bd6-82b0-4f5bd4254ec5\") " pod="openshift-authentication/oauth-openshift-558db77b4-cdzrp" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.055497 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/336437f6-aba2-46ae-bf5f-2555d2db13fb-node-pullsecrets\") pod \"apiserver-76f77b778f-78pq5\" (UID: \"336437f6-aba2-46ae-bf5f-2555d2db13fb\") " pod="openshift-apiserver/apiserver-76f77b778f-78pq5" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.055519 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/336437f6-aba2-46ae-bf5f-2555d2db13fb-encryption-config\") pod \"apiserver-76f77b778f-78pq5\" (UID: \"336437f6-aba2-46ae-bf5f-2555d2db13fb\") " pod="openshift-apiserver/apiserver-76f77b778f-78pq5" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.055541 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-79qhd\" (UniqueName: \"kubernetes.io/projected/e1b7817f-3229-472a-b433-c2173e7abf6c-kube-api-access-79qhd\") pod \"openshift-apiserver-operator-796bbdcf4f-6rjdm\" (UID: \"e1b7817f-3229-472a-b433-c2173e7abf6c\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-6rjdm" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.055562 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/65101460-48b8-4bd6-82b0-4f5bd4254ec5-audit-dir\") pod \"oauth-openshift-558db77b4-cdzrp\" (UID: \"65101460-48b8-4bd6-82b0-4f5bd4254ec5\") " pod="openshift-authentication/oauth-openshift-558db77b4-cdzrp" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.055583 4970 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.055313 4970 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-ts64w"] Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.056152 4970 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-ts64w" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.056606 4970 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.056801 4970 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.057027 4970 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.057170 4970 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.057397 4970 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.057600 4970 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.057740 4970 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.057876 4970 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.058008 4970 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.058140 4970 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.058309 4970 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.058450 4970 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.058470 4970 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.058721 4970 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.058830 4970 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.058928 4970 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.059021 4970 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.059128 4970 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-config-operator"/"kube-root-ca.crt" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.059395 4970 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.059575 4970 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.059653 4970 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.059659 4970 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.059764 4970 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.059872 4970 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.059877 4970 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.059969 4970 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.060019 4970 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.055584 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/3c65c694-b05d-40db-a754-9b530aadc7a7-audit-policies\") pod \"apiserver-7bbb656c7d-v5kwk\" (UID: \"3c65c694-b05d-40db-a754-9b530aadc7a7\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-v5kwk" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.060140 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/65101460-48b8-4bd6-82b0-4f5bd4254ec5-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-cdzrp\" (UID: \"65101460-48b8-4bd6-82b0-4f5bd4254ec5\") " pod="openshift-authentication/oauth-openshift-558db77b4-cdzrp" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.060179 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/3c65c694-b05d-40db-a754-9b530aadc7a7-etcd-client\") pod \"apiserver-7bbb656c7d-v5kwk\" (UID: \"3c65c694-b05d-40db-a754-9b530aadc7a7\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-v5kwk" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.060207 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/336437f6-aba2-46ae-bf5f-2555d2db13fb-audit-dir\") pod \"apiserver-76f77b778f-78pq5\" (UID: \"336437f6-aba2-46ae-bf5f-2555d2db13fb\") " pod="openshift-apiserver/apiserver-76f77b778f-78pq5" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.060267 4970 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3c65c694-b05d-40db-a754-9b530aadc7a7-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-v5kwk\" (UID: \"3c65c694-b05d-40db-a754-9b530aadc7a7\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-v5kwk" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.060293 4970 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.060292 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/65101460-48b8-4bd6-82b0-4f5bd4254ec5-audit-policies\") pod \"oauth-openshift-558db77b4-cdzrp\" (UID: \"65101460-48b8-4bd6-82b0-4f5bd4254ec5\") " pod="openshift-authentication/oauth-openshift-558db77b4-cdzrp" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.060425 4970 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.060538 4970 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.060706 4970 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.060975 4970 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.060993 4970 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.065253 4970 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.067341 4970 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.068392 4970 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.068618 4970 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.068717 4970 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.068799 4970 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.068932 4970 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.068979 4970 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.069234 4970 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.069328 4970 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.069413 4970 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.070390 4970 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.073344 4970 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.076016 4970 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-znlgk"] Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.076536 4970 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-znlgk" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.076888 4970 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-cbvqk"] Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.077752 4970 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-hvcsr"] Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.078380 4970 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-hvcsr" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.078956 4970 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.087322 4970 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-xbnhw"] Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.087864 4970 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-xbnhw" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.088097 4970 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-nwp64"] Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.092802 4970 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-nwp64" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.093593 4970 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-4rqzq"] Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.096446 4970 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-4rqzq" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.097125 4970 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-tjj4p"] Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.099792 4970 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-tjj4p" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.100375 4970 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-w4m45"] Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.104787 4970 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.112269 4970 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-6rjdm"] Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.112302 4970 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-rjnn5"] Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.112721 4970 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-w4m45" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.112914 4970 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-machine-approver/machine-approver-56656f9798-m8rgw"] Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.113149 4970 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-rjnn5" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.114898 4970 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-wms6k"] Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.114964 4970 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-m8rgw" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.116169 4970 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-78pq5"] Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.118139 4970 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.118993 4970 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-kfmkb"] Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.119607 4970 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-kfmkb" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.120266 4970 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-t7pk6"] Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.120926 4970 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-t7pk6" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.121741 4970 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-cdzrp"] Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.123776 4970 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-gdzxc"] Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.124169 4970 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-gdzxc" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.124534 4970 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-hv5px"] Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.124906 4970 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-hv5px" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.125495 4970 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-p47s7"] Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.126274 4970 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-p47s7" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.126498 4970 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-v5kwk"] Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.127407 4970 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-pldh4"] Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.127926 4970 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-pldh4" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.128492 4970 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-nccsb"] Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.128947 4970 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-nccsb" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.129416 4970 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29405595-bgxjt"] Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.130059 4970 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29405595-bgxjt" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.131027 4970 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-server-l7c64"] Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.131412 4970 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-l7c64" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.131586 4970 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-7bxck"] Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.131920 4970 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-7bxck" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.133268 4970 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-nwp64"] Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.133294 4970 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-xqmqh"] Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.134159 4970 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-4s4w9"] Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.135070 4970 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-9chbz"] Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.136027 4970 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-xbnhw"] Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.137051 4970 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-7nxhr"] Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.137562 4970 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.138303 4970 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-znlgk"] Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.139390 4970 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-86qzl"] Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.140449 4970 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-pzs29"] Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.141388 4970 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-rjnn5"] Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.142162 4970 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/cni-sysctl-allowlist-ds-8btsr"] Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.142647 4970 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/cni-sysctl-allowlist-ds-8btsr" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.143324 4970 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-q9h56"] Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.144186 4970 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-dhdzs"] Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.145012 4970 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-mdr5z"] Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.145888 4970 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-kwnx5"] Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.146874 4970 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-tjj4p"] Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.147743 4970 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-xzv2b"] Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.149156 4970 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-ts64w"] Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.150506 4970 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-2qght"] Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.153014 4970 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-2qght" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.155497 4970 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-4rqzq"] Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.156982 4970 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-w4m45"] Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.158271 4970 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-kfmkb"] Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.164592 4970 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-hvcsr"] Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.164859 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/336437f6-aba2-46ae-bf5f-2555d2db13fb-config\") pod \"apiserver-76f77b778f-78pq5\" (UID: \"336437f6-aba2-46ae-bf5f-2555d2db13fb\") " pod="openshift-apiserver/apiserver-76f77b778f-78pq5" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.165146 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/3c65c694-b05d-40db-a754-9b530aadc7a7-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-v5kwk\" (UID: \"3c65c694-b05d-40db-a754-9b530aadc7a7\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-v5kwk" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.165179 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: 
\"kubernetes.io/configmap/5998853c-3fbb-403e-b222-5a5c939dbb58-images\") pod \"machine-api-operator-5694c8668f-wms6k\" (UID: \"5998853c-3fbb-403e-b222-5a5c939dbb58\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-wms6k" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.165208 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.165252 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.165281 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/65101460-48b8-4bd6-82b0-4f5bd4254ec5-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-cdzrp\" (UID: \"65101460-48b8-4bd6-82b0-4f5bd4254ec5\") " pod="openshift-authentication/oauth-openshift-558db77b4-cdzrp" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.165496 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8d51dda0-3be6-42ef-a2ff-76d5c2da9b9b-serving-cert\") pod \"etcd-operator-b45778765-xqmqh\" (UID: \"8d51dda0-3be6-42ef-a2ff-76d5c2da9b9b\") " pod="openshift-etcd-operator/etcd-operator-b45778765-xqmqh" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.165539 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/336437f6-aba2-46ae-bf5f-2555d2db13fb-serving-cert\") pod \"apiserver-76f77b778f-78pq5\" (UID: \"336437f6-aba2-46ae-bf5f-2555d2db13fb\") " pod="openshift-apiserver/apiserver-76f77b778f-78pq5" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.165571 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/65101460-48b8-4bd6-82b0-4f5bd4254ec5-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-cdzrp\" (UID: \"65101460-48b8-4bd6-82b0-4f5bd4254ec5\") " pod="openshift-authentication/oauth-openshift-558db77b4-cdzrp" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.165734 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/b9e4bbc0-c71d-4cb0-82ab-a3c67a9a4894-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-hvcsr\" (UID: \"b9e4bbc0-c71d-4cb0-82ab-a3c67a9a4894\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-hvcsr" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.165765 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qnh6w\" (UniqueName: 
\"kubernetes.io/projected/79187155-9c7e-48a9-a3f8-3bcf8d921be6-kube-api-access-qnh6w\") pod \"downloads-7954f5f757-9chbz\" (UID: \"79187155-9c7e-48a9-a3f8-3bcf8d921be6\") " pod="openshift-console/downloads-7954f5f757-9chbz" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.165839 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7tmnn\" (UniqueName: \"kubernetes.io/projected/65101460-48b8-4bd6-82b0-4f5bd4254ec5-kube-api-access-7tmnn\") pod \"oauth-openshift-558db77b4-cdzrp\" (UID: \"65101460-48b8-4bd6-82b0-4f5bd4254ec5\") " pod="openshift-authentication/oauth-openshift-558db77b4-cdzrp" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.165876 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1c43e47b-7ccb-41ec-8f8f-08b159bb15f3-serving-cert\") pod \"console-operator-58897d9998-pzs29\" (UID: \"1c43e47b-7ccb-41ec-8f8f-08b159bb15f3\") " pod="openshift-console-operator/console-operator-58897d9998-pzs29" Nov 28 13:20:26 crc kubenswrapper[4970]: E1128 13:20:26.165942 4970 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.166004 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/19ff5c79-1e07-4c43-8d35-bdf19869c72b-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-mdr5z\" (UID: \"19ff5c79-1e07-4c43-8d35-bdf19869c72b\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-mdr5z" Nov 28 13:20:26 crc kubenswrapper[4970]: E1128 13:20:26.166021 4970 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-11-28 13:20:42.165995602 +0000 UTC m=+53.018877402 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.166053 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/336437f6-aba2-46ae-bf5f-2555d2db13fb-audit\") pod \"apiserver-76f77b778f-78pq5\" (UID: \"336437f6-aba2-46ae-bf5f-2555d2db13fb\") " pod="openshift-apiserver/apiserver-76f77b778f-78pq5" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.166115 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/cbdc9822-68a6-4bff-b373-cac82f25f4d3-service-ca\") pod \"console-f9d7485db-kwnx5\" (UID: \"cbdc9822-68a6-4bff-b373-cac82f25f4d3\") " pod="openshift-console/console-f9d7485db-kwnx5" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.166196 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2347a213-dd3e-4f1c-b36b-a8345bedb927-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-86qzl\" (UID: \"2347a213-dd3e-4f1c-b36b-a8345bedb927\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-86qzl" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.165836 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/336437f6-aba2-46ae-bf5f-2555d2db13fb-config\") pod \"apiserver-76f77b778f-78pq5\" (UID: \"336437f6-aba2-46ae-bf5f-2555d2db13fb\") " pod="openshift-apiserver/apiserver-76f77b778f-78pq5" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.166659 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/973ba3b3-d07b-40ef-8419-40e19838e816-service-ca-bundle\") pod \"router-default-5444994796-6tn57\" (UID: \"973ba3b3-d07b-40ef-8419-40e19838e816\") " pod="openshift-ingress/router-default-5444994796-6tn57" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.166760 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/9cd2da9c-112f-4043-af6f-a661a475cc2d-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-cq28p\" (UID: \"9cd2da9c-112f-4043-af6f-a661a475cc2d\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-cq28p" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.166815 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f85acbe7-b253-4d4a-847f-05845804f712-trusted-ca\") pod \"ingress-operator-5b745b69d9-rzrx9\" (UID: \"f85acbe7-b253-4d4a-847f-05845804f712\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-rzrx9" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.166848 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b0e1ef8e-6fc5-491d-b658-b812cc556f67-config\") pod 
\"openshift-controller-manager-operator-756b6f6bc6-dhdzs\" (UID: \"b0e1ef8e-6fc5-491d-b658-b812cc556f67\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-dhdzs" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.166899 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/973ba3b3-d07b-40ef-8419-40e19838e816-stats-auth\") pod \"router-default-5444994796-6tn57\" (UID: \"973ba3b3-d07b-40ef-8419-40e19838e816\") " pod="openshift-ingress/router-default-5444994796-6tn57" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.166934 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5998853c-3fbb-403e-b222-5a5c939dbb58-config\") pod \"machine-api-operator-5694c8668f-wms6k\" (UID: \"5998853c-3fbb-403e-b222-5a5c939dbb58\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-wms6k" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.167467 4970 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-t7pk6"] Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.166749 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/5998853c-3fbb-403e-b222-5a5c939dbb58-images\") pod \"machine-api-operator-5694c8668f-wms6k\" (UID: \"5998853c-3fbb-403e-b222-5a5c939dbb58\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-wms6k" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.167727 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/65101460-48b8-4bd6-82b0-4f5bd4254ec5-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-cdzrp\" (UID: \"65101460-48b8-4bd6-82b0-4f5bd4254ec5\") " pod="openshift-authentication/oauth-openshift-558db77b4-cdzrp" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.167779 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/65101460-48b8-4bd6-82b0-4f5bd4254ec5-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-cdzrp\" (UID: \"65101460-48b8-4bd6-82b0-4f5bd4254ec5\") " pod="openshift-authentication/oauth-openshift-558db77b4-cdzrp" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.167832 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/336437f6-aba2-46ae-bf5f-2555d2db13fb-node-pullsecrets\") pod \"apiserver-76f77b778f-78pq5\" (UID: \"336437f6-aba2-46ae-bf5f-2555d2db13fb\") " pod="openshift-apiserver/apiserver-76f77b778f-78pq5" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.167891 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/5a56c71c-0c49-4a0a-aee0-2a1ef5936574-available-featuregates\") pod \"openshift-config-operator-7777fb866f-7nxhr\" (UID: \"5a56c71c-0c49-4a0a-aee0-2a1ef5936574\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-7nxhr" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.167937 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: 
\"kubernetes.io/secret/336437f6-aba2-46ae-bf5f-2555d2db13fb-encryption-config\") pod \"apiserver-76f77b778f-78pq5\" (UID: \"336437f6-aba2-46ae-bf5f-2555d2db13fb\") " pod="openshift-apiserver/apiserver-76f77b778f-78pq5" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.168175 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-79qhd\" (UniqueName: \"kubernetes.io/projected/e1b7817f-3229-472a-b433-c2173e7abf6c-kube-api-access-79qhd\") pod \"openshift-apiserver-operator-796bbdcf4f-6rjdm\" (UID: \"e1b7817f-3229-472a-b433-c2173e7abf6c\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-6rjdm" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.168250 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/65101460-48b8-4bd6-82b0-4f5bd4254ec5-audit-dir\") pod \"oauth-openshift-558db77b4-cdzrp\" (UID: \"65101460-48b8-4bd6-82b0-4f5bd4254ec5\") " pod="openshift-authentication/oauth-openshift-558db77b4-cdzrp" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.168290 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/8d51dda0-3be6-42ef-a2ff-76d5c2da9b9b-etcd-client\") pod \"etcd-operator-b45778765-xqmqh\" (UID: \"8d51dda0-3be6-42ef-a2ff-76d5c2da9b9b\") " pod="openshift-etcd-operator/etcd-operator-b45778765-xqmqh" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.168317 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/f85acbe7-b253-4d4a-847f-05845804f712-metrics-tls\") pod \"ingress-operator-5b745b69d9-rzrx9\" (UID: \"f85acbe7-b253-4d4a-847f-05845804f712\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-rzrx9" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.168321 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/336437f6-aba2-46ae-bf5f-2555d2db13fb-node-pullsecrets\") pod \"apiserver-76f77b778f-78pq5\" (UID: \"336437f6-aba2-46ae-bf5f-2555d2db13fb\") " pod="openshift-apiserver/apiserver-76f77b778f-78pq5" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.168354 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3bf3cce7-2248-44a8-a39d-54860c49fb5f-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-xzv2b\" (UID: \"3bf3cce7-2248-44a8-a39d-54860c49fb5f\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-xzv2b" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.168397 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/900e9596-8294-4c4d-857a-1b2bf9adaca7-client-ca\") pod \"route-controller-manager-6576b87f9c-znlgk\" (UID: \"900e9596-8294-4c4d-857a-1b2bf9adaca7\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-znlgk" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.168440 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pqv54\" (UniqueName: \"kubernetes.io/projected/aa0691f5-c326-4370-a718-bc82bcfbdd78-kube-api-access-pqv54\") pod \"dns-operator-744455d44c-q9h56\" (UID: 
\"aa0691f5-c326-4370-a718-bc82bcfbdd78\") " pod="openshift-dns-operator/dns-operator-744455d44c-q9h56" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.168485 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/3c65c694-b05d-40db-a754-9b530aadc7a7-audit-policies\") pod \"apiserver-7bbb656c7d-v5kwk\" (UID: \"3c65c694-b05d-40db-a754-9b530aadc7a7\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-v5kwk" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.168516 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/65101460-48b8-4bd6-82b0-4f5bd4254ec5-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-cdzrp\" (UID: \"65101460-48b8-4bd6-82b0-4f5bd4254ec5\") " pod="openshift-authentication/oauth-openshift-558db77b4-cdzrp" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.168517 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/336437f6-aba2-46ae-bf5f-2555d2db13fb-audit\") pod \"apiserver-76f77b778f-78pq5\" (UID: \"336437f6-aba2-46ae-bf5f-2555d2db13fb\") " pod="openshift-apiserver/apiserver-76f77b778f-78pq5" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.168554 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/3c65c694-b05d-40db-a754-9b530aadc7a7-etcd-client\") pod \"apiserver-7bbb656c7d-v5kwk\" (UID: \"3c65c694-b05d-40db-a754-9b530aadc7a7\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-v5kwk" Nov 28 13:20:26 crc kubenswrapper[4970]: E1128 13:20:26.168698 4970 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.168761 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/cbdc9822-68a6-4bff-b373-cac82f25f4d3-console-config\") pod \"console-f9d7485db-kwnx5\" (UID: \"cbdc9822-68a6-4bff-b373-cac82f25f4d3\") " pod="openshift-console/console-f9d7485db-kwnx5" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.168797 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wg6tp\" (UniqueName: \"kubernetes.io/projected/b0e1ef8e-6fc5-491d-b658-b812cc556f67-kube-api-access-wg6tp\") pod \"openshift-controller-manager-operator-756b6f6bc6-dhdzs\" (UID: \"b0e1ef8e-6fc5-491d-b658-b812cc556f67\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-dhdzs" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.168835 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b0e1ef8e-6fc5-491d-b658-b812cc556f67-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-dhdzs\" (UID: \"b0e1ef8e-6fc5-491d-b658-b812cc556f67\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-dhdzs" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.168906 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/44b352f9-ee16-45a4-9674-e954eeed9c6c-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-4s4w9\" (UID: \"44b352f9-ee16-45a4-9674-e954eeed9c6c\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-4s4w9" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.168936 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zqkxm\" (UniqueName: \"kubernetes.io/projected/44b352f9-ee16-45a4-9674-e954eeed9c6c-kube-api-access-zqkxm\") pod \"cluster-image-registry-operator-dc59b4c8b-4s4w9\" (UID: \"44b352f9-ee16-45a4-9674-e954eeed9c6c\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-4s4w9" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.168917 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/65101460-48b8-4bd6-82b0-4f5bd4254ec5-audit-dir\") pod \"oauth-openshift-558db77b4-cdzrp\" (UID: \"65101460-48b8-4bd6-82b0-4f5bd4254ec5\") " pod="openshift-authentication/oauth-openshift-558db77b4-cdzrp" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.168975 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/336437f6-aba2-46ae-bf5f-2555d2db13fb-audit-dir\") pod \"apiserver-76f77b778f-78pq5\" (UID: \"336437f6-aba2-46ae-bf5f-2555d2db13fb\") " pod="openshift-apiserver/apiserver-76f77b778f-78pq5" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.169010 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3c65c694-b05d-40db-a754-9b530aadc7a7-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-v5kwk\" (UID: \"3c65c694-b05d-40db-a754-9b530aadc7a7\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-v5kwk" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.169034 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/65101460-48b8-4bd6-82b0-4f5bd4254ec5-audit-policies\") pod \"oauth-openshift-558db77b4-cdzrp\" (UID: \"65101460-48b8-4bd6-82b0-4f5bd4254ec5\") " pod="openshift-authentication/oauth-openshift-558db77b4-cdzrp" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.169112 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3bf3cce7-2248-44a8-a39d-54860c49fb5f-config\") pod \"authentication-operator-69f744f599-xzv2b\" (UID: \"3bf3cce7-2248-44a8-a39d-54860c49fb5f\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-xzv2b" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.169405 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/3c65c694-b05d-40db-a754-9b530aadc7a7-audit-policies\") pod \"apiserver-7bbb656c7d-v5kwk\" (UID: \"3c65c694-b05d-40db-a754-9b530aadc7a7\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-v5kwk" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.169527 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dc558\" (UniqueName: \"kubernetes.io/projected/1c43e47b-7ccb-41ec-8f8f-08b159bb15f3-kube-api-access-dc558\") pod \"console-operator-58897d9998-pzs29\" (UID: \"1c43e47b-7ccb-41ec-8f8f-08b159bb15f3\") " 
pod="openshift-console-operator/console-operator-58897d9998-pzs29" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.169557 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/973ba3b3-d07b-40ef-8419-40e19838e816-default-certificate\") pod \"router-default-5444994796-6tn57\" (UID: \"973ba3b3-d07b-40ef-8419-40e19838e816\") " pod="openshift-ingress/router-default-5444994796-6tn57" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.169590 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rchp7\" (UniqueName: \"kubernetes.io/projected/973ba3b3-d07b-40ef-8419-40e19838e816-kube-api-access-rchp7\") pod \"router-default-5444994796-6tn57\" (UID: \"973ba3b3-d07b-40ef-8419-40e19838e816\") " pod="openshift-ingress/router-default-5444994796-6tn57" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.169629 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/cbdc9822-68a6-4bff-b373-cac82f25f4d3-console-serving-cert\") pod \"console-f9d7485db-kwnx5\" (UID: \"cbdc9822-68a6-4bff-b373-cac82f25f4d3\") " pod="openshift-console/console-f9d7485db-kwnx5" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.169660 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-94hsc\" (UniqueName: \"kubernetes.io/projected/3bf3cce7-2248-44a8-a39d-54860c49fb5f-kube-api-access-94hsc\") pod \"authentication-operator-69f744f599-xzv2b\" (UID: \"3bf3cce7-2248-44a8-a39d-54860c49fb5f\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-xzv2b" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.169689 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e80ce492-28d4-40cf-8a55-5a4f456e8255-config\") pod \"controller-manager-879f6c89f-cbvqk\" (UID: \"e80ce492-28d4-40cf-8a55-5a4f456e8255\") " pod="openshift-controller-manager/controller-manager-879f6c89f-cbvqk" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.169711 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/336437f6-aba2-46ae-bf5f-2555d2db13fb-etcd-serving-ca\") pod \"apiserver-76f77b778f-78pq5\" (UID: \"336437f6-aba2-46ae-bf5f-2555d2db13fb\") " pod="openshift-apiserver/apiserver-76f77b778f-78pq5" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.169742 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/336437f6-aba2-46ae-bf5f-2555d2db13fb-image-import-ca\") pod \"apiserver-76f77b778f-78pq5\" (UID: \"336437f6-aba2-46ae-bf5f-2555d2db13fb\") " pod="openshift-apiserver/apiserver-76f77b778f-78pq5" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.169764 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-njb4z\" (UniqueName: \"kubernetes.io/projected/3c65c694-b05d-40db-a754-9b530aadc7a7-kube-api-access-njb4z\") pod \"apiserver-7bbb656c7d-v5kwk\" (UID: \"3c65c694-b05d-40db-a754-9b530aadc7a7\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-v5kwk" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.169783 4970 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/8d51dda0-3be6-42ef-a2ff-76d5c2da9b9b-etcd-service-ca\") pod \"etcd-operator-b45778765-xqmqh\" (UID: \"8d51dda0-3be6-42ef-a2ff-76d5c2da9b9b\") " pod="openshift-etcd-operator/etcd-operator-b45778765-xqmqh" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.169809 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6a678474-b488-4785-a87c-70df116b33c9-config\") pod \"kube-controller-manager-operator-78b949d7b-ts64w\" (UID: \"6a678474-b488-4785-a87c-70df116b33c9\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-ts64w" Nov 28 13:20:26 crc kubenswrapper[4970]: E1128 13:20:26.169898 4970 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-11-28 13:20:42.169886857 +0000 UTC m=+53.022768657 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.170148 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/65101460-48b8-4bd6-82b0-4f5bd4254ec5-audit-policies\") pod \"oauth-openshift-558db77b4-cdzrp\" (UID: \"65101460-48b8-4bd6-82b0-4f5bd4254ec5\") " pod="openshift-authentication/oauth-openshift-558db77b4-cdzrp" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.169809 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5998853c-3fbb-403e-b222-5a5c939dbb58-config\") pod \"machine-api-operator-5694c8668f-wms6k\" (UID: \"5998853c-3fbb-403e-b222-5a5c939dbb58\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-wms6k" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.170410 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/336437f6-aba2-46ae-bf5f-2555d2db13fb-audit-dir\") pod \"apiserver-76f77b778f-78pq5\" (UID: \"336437f6-aba2-46ae-bf5f-2555d2db13fb\") " pod="openshift-apiserver/apiserver-76f77b778f-78pq5" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.170725 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3c65c694-b05d-40db-a754-9b530aadc7a7-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-v5kwk\" (UID: \"3c65c694-b05d-40db-a754-9b530aadc7a7\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-v5kwk" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.170844 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/65101460-48b8-4bd6-82b0-4f5bd4254ec5-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-cdzrp\" (UID: \"65101460-48b8-4bd6-82b0-4f5bd4254ec5\") " pod="openshift-authentication/oauth-openshift-558db77b4-cdzrp" 
Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.171741 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/336437f6-aba2-46ae-bf5f-2555d2db13fb-etcd-serving-ca\") pod \"apiserver-76f77b778f-78pq5\" (UID: \"336437f6-aba2-46ae-bf5f-2555d2db13fb\") " pod="openshift-apiserver/apiserver-76f77b778f-78pq5" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.172046 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e80ce492-28d4-40cf-8a55-5a4f456e8255-config\") pod \"controller-manager-879f6c89f-cbvqk\" (UID: \"e80ce492-28d4-40cf-8a55-5a4f456e8255\") " pod="openshift-controller-manager/controller-manager-879f6c89f-cbvqk" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.173636 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/65101460-48b8-4bd6-82b0-4f5bd4254ec5-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-cdzrp\" (UID: \"65101460-48b8-4bd6-82b0-4f5bd4254ec5\") " pod="openshift-authentication/oauth-openshift-558db77b4-cdzrp" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.173829 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9cd2da9c-112f-4043-af6f-a661a475cc2d-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-cq28p\" (UID: \"9cd2da9c-112f-4043-af6f-a661a475cc2d\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-cq28p" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.173789 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/65101460-48b8-4bd6-82b0-4f5bd4254ec5-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-cdzrp\" (UID: \"65101460-48b8-4bd6-82b0-4f5bd4254ec5\") " pod="openshift-authentication/oauth-openshift-558db77b4-cdzrp" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.173885 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/3c65c694-b05d-40db-a754-9b530aadc7a7-audit-dir\") pod \"apiserver-7bbb656c7d-v5kwk\" (UID: \"3c65c694-b05d-40db-a754-9b530aadc7a7\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-v5kwk" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.173922 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3bf3cce7-2248-44a8-a39d-54860c49fb5f-serving-cert\") pod \"authentication-operator-69f744f599-xzv2b\" (UID: \"3bf3cce7-2248-44a8-a39d-54860c49fb5f\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-xzv2b" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.174487 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/65101460-48b8-4bd6-82b0-4f5bd4254ec5-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-cdzrp\" (UID: \"65101460-48b8-4bd6-82b0-4f5bd4254ec5\") " pod="openshift-authentication/oauth-openshift-558db77b4-cdzrp" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.175004 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"serving-cert\" (UniqueName: \"kubernetes.io/secret/336437f6-aba2-46ae-bf5f-2555d2db13fb-serving-cert\") pod \"apiserver-76f77b778f-78pq5\" (UID: \"336437f6-aba2-46ae-bf5f-2555d2db13fb\") " pod="openshift-apiserver/apiserver-76f77b778f-78pq5" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.175509 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.175573 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/aa0691f5-c326-4370-a718-bc82bcfbdd78-metrics-tls\") pod \"dns-operator-744455d44c-q9h56\" (UID: \"aa0691f5-c326-4370-a718-bc82bcfbdd78\") " pod="openshift-dns-operator/dns-operator-744455d44c-q9h56" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.175608 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/3c65c694-b05d-40db-a754-9b530aadc7a7-audit-dir\") pod \"apiserver-7bbb656c7d-v5kwk\" (UID: \"3c65c694-b05d-40db-a754-9b530aadc7a7\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-v5kwk" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.175719 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/5998853c-3fbb-403e-b222-5a5c939dbb58-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-wms6k\" (UID: \"5998853c-3fbb-403e-b222-5a5c939dbb58\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-wms6k" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.175870 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vglkf\" (UniqueName: \"kubernetes.io/projected/5998853c-3fbb-403e-b222-5a5c939dbb58-kube-api-access-vglkf\") pod \"machine-api-operator-5694c8668f-wms6k\" (UID: \"5998853c-3fbb-403e-b222-5a5c939dbb58\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-wms6k" Nov 28 13:20:26 crc kubenswrapper[4970]: E1128 13:20:26.175880 4970 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Nov 28 13:20:26 crc kubenswrapper[4970]: E1128 13:20:26.175899 4970 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Nov 28 13:20:26 crc kubenswrapper[4970]: E1128 13:20:26.175933 4970 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.175986 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8d51dda0-3be6-42ef-a2ff-76d5c2da9b9b-config\") pod \"etcd-operator-b45778765-xqmqh\" (UID: \"8d51dda0-3be6-42ef-a2ff-76d5c2da9b9b\") " 
pod="openshift-etcd-operator/etcd-operator-b45778765-xqmqh" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.176035 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5a56c71c-0c49-4a0a-aee0-2a1ef5936574-serving-cert\") pod \"openshift-config-operator-7777fb866f-7nxhr\" (UID: \"5a56c71c-0c49-4a0a-aee0-2a1ef5936574\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-7nxhr" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.176087 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e77b7513-27ee-47a7-b39a-a11dd78a0500-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-nwp64\" (UID: \"e77b7513-27ee-47a7-b39a-a11dd78a0500\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-nwp64" Nov 28 13:20:26 crc kubenswrapper[4970]: E1128 13:20:26.176148 4970 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-11-28 13:20:42.17609885 +0000 UTC m=+53.028980650 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.176179 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2347a213-dd3e-4f1c-b36b-a8345bedb927-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-86qzl\" (UID: \"2347a213-dd3e-4f1c-b36b-a8345bedb927\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-86qzl" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.176270 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/65101460-48b8-4bd6-82b0-4f5bd4254ec5-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-cdzrp\" (UID: \"65101460-48b8-4bd6-82b0-4f5bd4254ec5\") " pod="openshift-authentication/oauth-openshift-558db77b4-cdzrp" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.176327 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/cbdc9822-68a6-4bff-b373-cac82f25f4d3-oauth-serving-cert\") pod \"console-f9d7485db-kwnx5\" (UID: \"cbdc9822-68a6-4bff-b373-cac82f25f4d3\") " pod="openshift-console/console-f9d7485db-kwnx5" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.176437 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/3c65c694-b05d-40db-a754-9b530aadc7a7-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-v5kwk\" (UID: \"3c65c694-b05d-40db-a754-9b530aadc7a7\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-v5kwk" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.176690 
4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e77b7513-27ee-47a7-b39a-a11dd78a0500-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-nwp64\" (UID: \"e77b7513-27ee-47a7-b39a-a11dd78a0500\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-nwp64" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.176866 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/336437f6-aba2-46ae-bf5f-2555d2db13fb-image-import-ca\") pod \"apiserver-76f77b778f-78pq5\" (UID: \"336437f6-aba2-46ae-bf5f-2555d2db13fb\") " pod="openshift-apiserver/apiserver-76f77b778f-78pq5" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.176787 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fz5k5\" (UniqueName: \"kubernetes.io/projected/0c874537-15c1-4f94-b6cd-086c9c1762f0-kube-api-access-fz5k5\") pod \"migrator-59844c95c7-xbnhw\" (UID: \"0c874537-15c1-4f94-b6cd-086c9c1762f0\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-xbnhw" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.177004 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/44b352f9-ee16-45a4-9674-e954eeed9c6c-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-4s4w9\" (UID: \"44b352f9-ee16-45a4-9674-e954eeed9c6c\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-4s4w9" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.177070 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9hnf9\" (UniqueName: \"kubernetes.io/projected/8d51dda0-3be6-42ef-a2ff-76d5c2da9b9b-kube-api-access-9hnf9\") pod \"etcd-operator-b45778765-xqmqh\" (UID: \"8d51dda0-3be6-42ef-a2ff-76d5c2da9b9b\") " pod="openshift-etcd-operator/etcd-operator-b45778765-xqmqh" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.177138 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3bf3cce7-2248-44a8-a39d-54860c49fb5f-service-ca-bundle\") pod \"authentication-operator-69f744f599-xzv2b\" (UID: \"3bf3cce7-2248-44a8-a39d-54860c49fb5f\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-xzv2b" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.177164 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wpv9z\" (UniqueName: \"kubernetes.io/projected/900e9596-8294-4c4d-857a-1b2bf9adaca7-kube-api-access-wpv9z\") pod \"route-controller-manager-6576b87f9c-znlgk\" (UID: \"900e9596-8294-4c4d-857a-1b2bf9adaca7\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-znlgk" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.177305 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/65101460-48b8-4bd6-82b0-4f5bd4254ec5-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-cdzrp\" (UID: \"65101460-48b8-4bd6-82b0-4f5bd4254ec5\") " pod="openshift-authentication/oauth-openshift-558db77b4-cdzrp" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 
13:20:26.177456 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/cbdc9822-68a6-4bff-b373-cac82f25f4d3-console-oauth-config\") pod \"console-f9d7485db-kwnx5\" (UID: \"cbdc9822-68a6-4bff-b373-cac82f25f4d3\") " pod="openshift-console/console-f9d7485db-kwnx5" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.177568 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/f85acbe7-b253-4d4a-847f-05845804f712-bound-sa-token\") pod \"ingress-operator-5b745b69d9-rzrx9\" (UID: \"f85acbe7-b253-4d4a-847f-05845804f712\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-rzrx9" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.177695 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/336437f6-aba2-46ae-bf5f-2555d2db13fb-trusted-ca-bundle\") pod \"apiserver-76f77b778f-78pq5\" (UID: \"336437f6-aba2-46ae-bf5f-2555d2db13fb\") " pod="openshift-apiserver/apiserver-76f77b778f-78pq5" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.177806 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-45pqp\" (UniqueName: \"kubernetes.io/projected/e80ce492-28d4-40cf-8a55-5a4f456e8255-kube-api-access-45pqp\") pod \"controller-manager-879f6c89f-cbvqk\" (UID: \"e80ce492-28d4-40cf-8a55-5a4f456e8255\") " pod="openshift-controller-manager/controller-manager-879f6c89f-cbvqk" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.177923 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e1b7817f-3229-472a-b433-c2173e7abf6c-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-6rjdm\" (UID: \"e1b7817f-3229-472a-b433-c2173e7abf6c\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-6rjdm" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.178065 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e1b7817f-3229-472a-b433-c2173e7abf6c-config\") pod \"openshift-apiserver-operator-796bbdcf4f-6rjdm\" (UID: \"e1b7817f-3229-472a-b433-c2173e7abf6c\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-6rjdm" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.178191 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/900e9596-8294-4c4d-857a-1b2bf9adaca7-config\") pod \"route-controller-manager-6576b87f9c-znlgk\" (UID: \"900e9596-8294-4c4d-857a-1b2bf9adaca7\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-znlgk" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.177873 4970 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.178390 4970 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-cq28p"] Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.177930 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/65101460-48b8-4bd6-82b0-4f5bd4254ec5-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-cdzrp\" (UID: \"65101460-48b8-4bd6-82b0-4f5bd4254ec5\") " pod="openshift-authentication/oauth-openshift-558db77b4-cdzrp" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.178404 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/3c65c694-b05d-40db-a754-9b530aadc7a7-encryption-config\") pod \"apiserver-7bbb656c7d-v5kwk\" (UID: \"3c65c694-b05d-40db-a754-9b530aadc7a7\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-v5kwk" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.178885 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/65101460-48b8-4bd6-82b0-4f5bd4254ec5-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-cdzrp\" (UID: \"65101460-48b8-4bd6-82b0-4f5bd4254ec5\") " pod="openshift-authentication/oauth-openshift-558db77b4-cdzrp" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.178998 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rr8gn\" (UniqueName: \"kubernetes.io/projected/19ff5c79-1e07-4c43-8d35-bdf19869c72b-kube-api-access-rr8gn\") pod \"cluster-samples-operator-665b6dd947-mdr5z\" (UID: \"19ff5c79-1e07-4c43-8d35-bdf19869c72b\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-mdr5z" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.179107 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/6a678474-b488-4785-a87c-70df116b33c9-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-ts64w\" (UID: \"6a678474-b488-4785-a87c-70df116b33c9\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-ts64w" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.179278 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e80ce492-28d4-40cf-8a55-5a4f456e8255-client-ca\") pod \"controller-manager-879f6c89f-cbvqk\" (UID: \"e80ce492-28d4-40cf-8a55-5a4f456e8255\") " pod="openshift-controller-manager/controller-manager-879f6c89f-cbvqk" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.179334 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qpp5l\" (UniqueName: \"kubernetes.io/projected/5a56c71c-0c49-4a0a-aee0-2a1ef5936574-kube-api-access-qpp5l\") pod \"openshift-config-operator-7777fb866f-7nxhr\" (UID: \"5a56c71c-0c49-4a0a-aee0-2a1ef5936574\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-7nxhr" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.179384 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cbdc9822-68a6-4bff-b373-cac82f25f4d3-trusted-ca-bundle\") pod \"console-f9d7485db-kwnx5\" (UID: \"cbdc9822-68a6-4bff-b373-cac82f25f4d3\") " pod="openshift-console/console-f9d7485db-kwnx5" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.179694 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: 
\"kubernetes.io/secret/3c65c694-b05d-40db-a754-9b530aadc7a7-etcd-client\") pod \"apiserver-7bbb656c7d-v5kwk\" (UID: \"3c65c694-b05d-40db-a754-9b530aadc7a7\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-v5kwk" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.179932 4970 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.180251 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/65101460-48b8-4bd6-82b0-4f5bd4254ec5-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-cdzrp\" (UID: \"65101460-48b8-4bd6-82b0-4f5bd4254ec5\") " pod="openshift-authentication/oauth-openshift-558db77b4-cdzrp" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.180640 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5qmv6\" (UniqueName: \"kubernetes.io/projected/f85acbe7-b253-4d4a-847f-05845804f712-kube-api-access-5qmv6\") pod \"ingress-operator-5b745b69d9-rzrx9\" (UID: \"f85acbe7-b253-4d4a-847f-05845804f712\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-rzrx9" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.180676 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/973ba3b3-d07b-40ef-8419-40e19838e816-metrics-certs\") pod \"router-default-5444994796-6tn57\" (UID: \"973ba3b3-d07b-40ef-8419-40e19838e816\") " pod="openshift-ingress/router-default-5444994796-6tn57" Nov 28 13:20:26 crc kubenswrapper[4970]: E1128 13:20:26.180844 4970 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Nov 28 13:20:26 crc kubenswrapper[4970]: E1128 13:20:26.180867 4970 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Nov 28 13:20:26 crc kubenswrapper[4970]: E1128 13:20:26.180878 4970 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 28 13:20:26 crc kubenswrapper[4970]: E1128 13:20:26.180920 4970 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-11-28 13:20:42.180907742 +0000 UTC m=+53.033789542 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.180750 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.181031 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/8d51dda0-3be6-42ef-a2ff-76d5c2da9b9b-etcd-ca\") pod \"etcd-operator-b45778765-xqmqh\" (UID: \"8d51dda0-3be6-42ef-a2ff-76d5c2da9b9b\") " pod="openshift-etcd-operator/etcd-operator-b45778765-xqmqh" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.181065 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jwdc8\" (UniqueName: \"kubernetes.io/projected/336437f6-aba2-46ae-bf5f-2555d2db13fb-kube-api-access-jwdc8\") pod \"apiserver-76f77b778f-78pq5\" (UID: \"336437f6-aba2-46ae-bf5f-2555d2db13fb\") " pod="openshift-apiserver/apiserver-76f77b778f-78pq5" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.181091 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5v7cs\" (UniqueName: \"kubernetes.io/projected/cbdc9822-68a6-4bff-b373-cac82f25f4d3-kube-api-access-5v7cs\") pod \"console-f9d7485db-kwnx5\" (UID: \"cbdc9822-68a6-4bff-b373-cac82f25f4d3\") " pod="openshift-console/console-f9d7485db-kwnx5" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.181117 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/65101460-48b8-4bd6-82b0-4f5bd4254ec5-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-cdzrp\" (UID: \"65101460-48b8-4bd6-82b0-4f5bd4254ec5\") " pod="openshift-authentication/oauth-openshift-558db77b4-cdzrp" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.181133 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/65101460-48b8-4bd6-82b0-4f5bd4254ec5-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-cdzrp\" (UID: \"65101460-48b8-4bd6-82b0-4f5bd4254ec5\") " pod="openshift-authentication/oauth-openshift-558db77b4-cdzrp" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.181174 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rtzjd\" (UniqueName: \"kubernetes.io/projected/2347a213-dd3e-4f1c-b36b-a8345bedb927-kube-api-access-rtzjd\") pod \"kube-storage-version-migrator-operator-b67b599dd-86qzl\" (UID: \"2347a213-dd3e-4f1c-b36b-a8345bedb927\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-86qzl" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 
13:20:26.181243 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hsklx\" (UniqueName: \"kubernetes.io/projected/b9e4bbc0-c71d-4cb0-82ab-a3c67a9a4894-kube-api-access-hsklx\") pod \"control-plane-machine-set-operator-78cbb6b69f-hvcsr\" (UID: \"b9e4bbc0-c71d-4cb0-82ab-a3c67a9a4894\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-hvcsr" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.181279 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e80ce492-28d4-40cf-8a55-5a4f456e8255-serving-cert\") pod \"controller-manager-879f6c89f-cbvqk\" (UID: \"e80ce492-28d4-40cf-8a55-5a4f456e8255\") " pod="openshift-controller-manager/controller-manager-879f6c89f-cbvqk" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.181341 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3c65c694-b05d-40db-a754-9b530aadc7a7-serving-cert\") pod \"apiserver-7bbb656c7d-v5kwk\" (UID: \"3c65c694-b05d-40db-a754-9b530aadc7a7\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-v5kwk" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.181390 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/65101460-48b8-4bd6-82b0-4f5bd4254ec5-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-cdzrp\" (UID: \"65101460-48b8-4bd6-82b0-4f5bd4254ec5\") " pod="openshift-authentication/oauth-openshift-558db77b4-cdzrp" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.181419 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e77b7513-27ee-47a7-b39a-a11dd78a0500-config\") pod \"kube-apiserver-operator-766d6c64bb-nwp64\" (UID: \"e77b7513-27ee-47a7-b39a-a11dd78a0500\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-nwp64" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.181450 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/900e9596-8294-4c4d-857a-1b2bf9adaca7-serving-cert\") pod \"route-controller-manager-6576b87f9c-znlgk\" (UID: \"900e9596-8294-4c4d-857a-1b2bf9adaca7\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-znlgk" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.181486 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e80ce492-28d4-40cf-8a55-5a4f456e8255-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-cbvqk\" (UID: \"e80ce492-28d4-40cf-8a55-5a4f456e8255\") " pod="openshift-controller-manager/controller-manager-879f6c89f-cbvqk" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.181715 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/65101460-48b8-4bd6-82b0-4f5bd4254ec5-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-cdzrp\" (UID: \"65101460-48b8-4bd6-82b0-4f5bd4254ec5\") " pod="openshift-authentication/oauth-openshift-558db77b4-cdzrp" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.181913 4970 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e1b7817f-3229-472a-b433-c2173e7abf6c-config\") pod \"openshift-apiserver-operator-796bbdcf4f-6rjdm\" (UID: \"e1b7817f-3229-472a-b433-c2173e7abf6c\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-6rjdm" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.182022 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6a678474-b488-4785-a87c-70df116b33c9-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-ts64w\" (UID: \"6a678474-b488-4785-a87c-70df116b33c9\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-ts64w" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.182065 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/44b352f9-ee16-45a4-9674-e954eeed9c6c-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-4s4w9\" (UID: \"44b352f9-ee16-45a4-9674-e954eeed9c6c\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-4s4w9" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.182357 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/336437f6-aba2-46ae-bf5f-2555d2db13fb-etcd-client\") pod \"apiserver-76f77b778f-78pq5\" (UID: \"336437f6-aba2-46ae-bf5f-2555d2db13fb\") " pod="openshift-apiserver/apiserver-76f77b778f-78pq5" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.182349 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/65101460-48b8-4bd6-82b0-4f5bd4254ec5-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-cdzrp\" (UID: \"65101460-48b8-4bd6-82b0-4f5bd4254ec5\") " pod="openshift-authentication/oauth-openshift-558db77b4-cdzrp" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.182376 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/3c65c694-b05d-40db-a754-9b530aadc7a7-encryption-config\") pod \"apiserver-7bbb656c7d-v5kwk\" (UID: \"3c65c694-b05d-40db-a754-9b530aadc7a7\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-v5kwk" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.182660 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e80ce492-28d4-40cf-8a55-5a4f456e8255-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-cbvqk\" (UID: \"e80ce492-28d4-40cf-8a55-5a4f456e8255\") " pod="openshift-controller-manager/controller-manager-879f6c89f-cbvqk" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.182713 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/336437f6-aba2-46ae-bf5f-2555d2db13fb-encryption-config\") pod \"apiserver-76f77b778f-78pq5\" (UID: \"336437f6-aba2-46ae-bf5f-2555d2db13fb\") " pod="openshift-apiserver/apiserver-76f77b778f-78pq5" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.183004 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1c43e47b-7ccb-41ec-8f8f-08b159bb15f3-config\") pod 
\"console-operator-58897d9998-pzs29\" (UID: \"1c43e47b-7ccb-41ec-8f8f-08b159bb15f3\") " pod="openshift-console-operator/console-operator-58897d9998-pzs29" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.183080 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/1c43e47b-7ccb-41ec-8f8f-08b159bb15f3-trusted-ca\") pod \"console-operator-58897d9998-pzs29\" (UID: \"1c43e47b-7ccb-41ec-8f8f-08b159bb15f3\") " pod="openshift-console-operator/console-operator-58897d9998-pzs29" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.183102 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e1b7817f-3229-472a-b433-c2173e7abf6c-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-6rjdm\" (UID: \"e1b7817f-3229-472a-b433-c2173e7abf6c\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-6rjdm" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.183197 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9cd2da9c-112f-4043-af6f-a661a475cc2d-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-cq28p\" (UID: \"9cd2da9c-112f-4043-af6f-a661a475cc2d\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-cq28p" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.183377 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/336437f6-aba2-46ae-bf5f-2555d2db13fb-trusted-ca-bundle\") pod \"apiserver-76f77b778f-78pq5\" (UID: \"336437f6-aba2-46ae-bf5f-2555d2db13fb\") " pod="openshift-apiserver/apiserver-76f77b778f-78pq5" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.185499 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/336437f6-aba2-46ae-bf5f-2555d2db13fb-etcd-client\") pod \"apiserver-76f77b778f-78pq5\" (UID: \"336437f6-aba2-46ae-bf5f-2555d2db13fb\") " pod="openshift-apiserver/apiserver-76f77b778f-78pq5" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.189061 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3c65c694-b05d-40db-a754-9b530aadc7a7-serving-cert\") pod \"apiserver-7bbb656c7d-v5kwk\" (UID: \"3c65c694-b05d-40db-a754-9b530aadc7a7\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-v5kwk" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.189137 4970 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-rzrx9"] Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.189342 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/65101460-48b8-4bd6-82b0-4f5bd4254ec5-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-cdzrp\" (UID: \"65101460-48b8-4bd6-82b0-4f5bd4254ec5\") " pod="openshift-authentication/oauth-openshift-558db77b4-cdzrp" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.189454 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/5998853c-3fbb-403e-b222-5a5c939dbb58-machine-api-operator-tls\") pod 
\"machine-api-operator-5694c8668f-wms6k\" (UID: \"5998853c-3fbb-403e-b222-5a5c939dbb58\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-wms6k" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.189626 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/65101460-48b8-4bd6-82b0-4f5bd4254ec5-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-cdzrp\" (UID: \"65101460-48b8-4bd6-82b0-4f5bd4254ec5\") " pod="openshift-authentication/oauth-openshift-558db77b4-cdzrp" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.190018 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/65101460-48b8-4bd6-82b0-4f5bd4254ec5-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-cdzrp\" (UID: \"65101460-48b8-4bd6-82b0-4f5bd4254ec5\") " pod="openshift-authentication/oauth-openshift-558db77b4-cdzrp" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.190769 4970 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-2qght"] Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.191790 4970 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29405595-bgxjt"] Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.192800 4970 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-7bxck"] Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.193789 4970 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-p47s7"] Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.194693 4970 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-pldh4"] Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.195594 4970 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-gdzxc"] Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.197356 4970 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-hv5px"] Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.197455 4970 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-nccsb"] Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.197687 4970 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.198557 4970 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-kdkxk"] Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.199441 4970 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-kdkxk" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.199468 4970 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-kdkxk"] Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.218600 4970 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.241016 4970 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.258667 4970 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.278050 4970 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.289773 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5a56c71c-0c49-4a0a-aee0-2a1ef5936574-serving-cert\") pod \"openshift-config-operator-7777fb866f-7nxhr\" (UID: \"5a56c71c-0c49-4a0a-aee0-2a1ef5936574\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-7nxhr" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.289816 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e77b7513-27ee-47a7-b39a-a11dd78a0500-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-nwp64\" (UID: \"e77b7513-27ee-47a7-b39a-a11dd78a0500\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-nwp64" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.289840 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2347a213-dd3e-4f1c-b36b-a8345bedb927-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-86qzl\" (UID: \"2347a213-dd3e-4f1c-b36b-a8345bedb927\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-86qzl" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.289869 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/cbdc9822-68a6-4bff-b373-cac82f25f4d3-oauth-serving-cert\") pod \"console-f9d7485db-kwnx5\" (UID: \"cbdc9822-68a6-4bff-b373-cac82f25f4d3\") " pod="openshift-console/console-f9d7485db-kwnx5" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.289906 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/44b352f9-ee16-45a4-9674-e954eeed9c6c-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-4s4w9\" (UID: \"44b352f9-ee16-45a4-9674-e954eeed9c6c\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-4s4w9" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.289945 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q6s4p\" (UniqueName: \"kubernetes.io/projected/e2bc2c5d-22da-4436-b7bc-0924d7f275f2-kube-api-access-q6s4p\") pod \"machine-approver-56656f9798-m8rgw\" (UID: 
\"e2bc2c5d-22da-4436-b7bc-0924d7f275f2\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-m8rgw" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.289980 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9hnf9\" (UniqueName: \"kubernetes.io/projected/8d51dda0-3be6-42ef-a2ff-76d5c2da9b9b-kube-api-access-9hnf9\") pod \"etcd-operator-b45778765-xqmqh\" (UID: \"8d51dda0-3be6-42ef-a2ff-76d5c2da9b9b\") " pod="openshift-etcd-operator/etcd-operator-b45778765-xqmqh" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.290037 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3bf3cce7-2248-44a8-a39d-54860c49fb5f-service-ca-bundle\") pod \"authentication-operator-69f744f599-xzv2b\" (UID: \"3bf3cce7-2248-44a8-a39d-54860c49fb5f\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-xzv2b" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.290061 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/3e2746d8-5b75-43dc-80f4-81f69c38bb35-signing-key\") pod \"service-ca-9c57cc56f-kfmkb\" (UID: \"3e2746d8-5b75-43dc-80f4-81f69c38bb35\") " pod="openshift-service-ca/service-ca-9c57cc56f-kfmkb" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.290085 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a4f08974-1f20-4fe4-a63c-3c69b3064e4b-cert\") pod \"ingress-canary-kdkxk\" (UID: \"a4f08974-1f20-4fe4-a63c-3c69b3064e4b\") " pod="openshift-ingress-canary/ingress-canary-kdkxk" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.290117 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gx24w\" (UniqueName: \"kubernetes.io/projected/a5b9dda0-da70-4e7c-850b-de8b7744a15c-kube-api-access-gx24w\") pod \"cni-sysctl-allowlist-ds-8btsr\" (UID: \"a5b9dda0-da70-4e7c-850b-de8b7744a15c\") " pod="openshift-multus/cni-sysctl-allowlist-ds-8btsr" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.290144 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/6a678474-b488-4785-a87c-70df116b33c9-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-ts64w\" (UID: \"6a678474-b488-4785-a87c-70df116b33c9\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-ts64w" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.290243 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5qmv6\" (UniqueName: \"kubernetes.io/projected/f85acbe7-b253-4d4a-847f-05845804f712-kube-api-access-5qmv6\") pod \"ingress-operator-5b745b69d9-rzrx9\" (UID: \"f85acbe7-b253-4d4a-847f-05845804f712\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-rzrx9" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.290279 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/973ba3b3-d07b-40ef-8419-40e19838e816-metrics-certs\") pod \"router-default-5444994796-6tn57\" (UID: \"973ba3b3-d07b-40ef-8419-40e19838e816\") " pod="openshift-ingress/router-default-5444994796-6tn57" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 
13:20:26.290312 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/8d51dda0-3be6-42ef-a2ff-76d5c2da9b9b-etcd-ca\") pod \"etcd-operator-b45778765-xqmqh\" (UID: \"8d51dda0-3be6-42ef-a2ff-76d5c2da9b9b\") " pod="openshift-etcd-operator/etcd-operator-b45778765-xqmqh" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.290338 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lskfl\" (UniqueName: \"kubernetes.io/projected/34a23190-6bba-4504-ada1-0724c8a4f1df-kube-api-access-lskfl\") pod \"package-server-manager-789f6589d5-t7pk6\" (UID: \"34a23190-6bba-4504-ada1-0724c8a4f1df\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-t7pk6" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.290365 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/849a6379-8100-4799-aa30-06c1359673b7-webhook-cert\") pod \"packageserver-d55dfcdfc-hv5px\" (UID: \"849a6379-8100-4799-aa30-06c1359673b7\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-hv5px" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.290398 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rtzjd\" (UniqueName: \"kubernetes.io/projected/2347a213-dd3e-4f1c-b36b-a8345bedb927-kube-api-access-rtzjd\") pod \"kube-storage-version-migrator-operator-b67b599dd-86qzl\" (UID: \"2347a213-dd3e-4f1c-b36b-a8345bedb927\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-86qzl" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.290445 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hsklx\" (UniqueName: \"kubernetes.io/projected/b9e4bbc0-c71d-4cb0-82ab-a3c67a9a4894-kube-api-access-hsklx\") pod \"control-plane-machine-set-operator-78cbb6b69f-hvcsr\" (UID: \"b9e4bbc0-c71d-4cb0-82ab-a3c67a9a4894\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-hvcsr" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.290480 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e77b7513-27ee-47a7-b39a-a11dd78a0500-config\") pod \"kube-apiserver-operator-766d6c64bb-nwp64\" (UID: \"e77b7513-27ee-47a7-b39a-a11dd78a0500\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-nwp64" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.290519 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/900e9596-8294-4c4d-857a-1b2bf9adaca7-serving-cert\") pod \"route-controller-manager-6576b87f9c-znlgk\" (UID: \"900e9596-8294-4c4d-857a-1b2bf9adaca7\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-znlgk" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.290543 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6a678474-b488-4785-a87c-70df116b33c9-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-ts64w\" (UID: \"6a678474-b488-4785-a87c-70df116b33c9\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-ts64w" Nov 28 13:20:26 crc kubenswrapper[4970]: 
I1128 13:20:26.290560 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/1c43e47b-7ccb-41ec-8f8f-08b159bb15f3-trusted-ca\") pod \"console-operator-58897d9998-pzs29\" (UID: \"1c43e47b-7ccb-41ec-8f8f-08b159bb15f3\") " pod="openshift-console-operator/console-operator-58897d9998-pzs29" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.290582 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1c43e47b-7ccb-41ec-8f8f-08b159bb15f3-config\") pod \"console-operator-58897d9998-pzs29\" (UID: \"1c43e47b-7ccb-41ec-8f8f-08b159bb15f3\") " pod="openshift-console-operator/console-operator-58897d9998-pzs29" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.290602 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/24797c3c-a0a6-4bab-b61e-dcc1aaedcccb-registration-dir\") pod \"csi-hostpathplugin-2qght\" (UID: \"24797c3c-a0a6-4bab-b61e-dcc1aaedcccb\") " pod="hostpath-provisioner/csi-hostpathplugin-2qght" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.290632 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8d51dda0-3be6-42ef-a2ff-76d5c2da9b9b-serving-cert\") pod \"etcd-operator-b45778765-xqmqh\" (UID: \"8d51dda0-3be6-42ef-a2ff-76d5c2da9b9b\") " pod="openshift-etcd-operator/etcd-operator-b45778765-xqmqh" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.290650 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/15dfbf5e-77ac-477c-89f7-b1e035a219c0-certs\") pod \"machine-config-server-l7c64\" (UID: \"15dfbf5e-77ac-477c-89f7-b1e035a219c0\") " pod="openshift-machine-config-operator/machine-config-server-l7c64" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.290665 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b47zt\" (UniqueName: \"kubernetes.io/projected/a4f08974-1f20-4fe4-a63c-3c69b3064e4b-kube-api-access-b47zt\") pod \"ingress-canary-kdkxk\" (UID: \"a4f08974-1f20-4fe4-a63c-3c69b3064e4b\") " pod="openshift-ingress-canary/ingress-canary-kdkxk" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.290681 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qnh6w\" (UniqueName: \"kubernetes.io/projected/79187155-9c7e-48a9-a3f8-3bcf8d921be6-kube-api-access-qnh6w\") pod \"downloads-7954f5f757-9chbz\" (UID: \"79187155-9c7e-48a9-a3f8-3bcf8d921be6\") " pod="openshift-console/downloads-7954f5f757-9chbz" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.290746 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/19ff5c79-1e07-4c43-8d35-bdf19869c72b-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-mdr5z\" (UID: \"19ff5c79-1e07-4c43-8d35-bdf19869c72b\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-mdr5z" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.290783 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/3e2746d8-5b75-43dc-80f4-81f69c38bb35-signing-cabundle\") pod 
\"service-ca-9c57cc56f-kfmkb\" (UID: \"3e2746d8-5b75-43dc-80f4-81f69c38bb35\") " pod="openshift-service-ca/service-ca-9c57cc56f-kfmkb" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.290799 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/5d8594d0-33ac-470e-b2d5-65d3afaa625b-profile-collector-cert\") pod \"olm-operator-6b444d44fb-gdzxc\" (UID: \"5d8594d0-33ac-470e-b2d5-65d3afaa625b\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-gdzxc" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.290822 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2347a213-dd3e-4f1c-b36b-a8345bedb927-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-86qzl\" (UID: \"2347a213-dd3e-4f1c-b36b-a8345bedb927\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-86qzl" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.290838 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/9cd2da9c-112f-4043-af6f-a661a475cc2d-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-cq28p\" (UID: \"9cd2da9c-112f-4043-af6f-a661a475cc2d\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-cq28p" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.290855 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/a5b9dda0-da70-4e7c-850b-de8b7744a15c-cni-sysctl-allowlist\") pod \"cni-sysctl-allowlist-ds-8btsr\" (UID: \"a5b9dda0-da70-4e7c-850b-de8b7744a15c\") " pod="openshift-multus/cni-sysctl-allowlist-ds-8btsr" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.290871 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/e2bc2c5d-22da-4436-b7bc-0924d7f275f2-machine-approver-tls\") pod \"machine-approver-56656f9798-m8rgw\" (UID: \"e2bc2c5d-22da-4436-b7bc-0924d7f275f2\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-m8rgw" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.290890 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/973ba3b3-d07b-40ef-8419-40e19838e816-stats-auth\") pod \"router-default-5444994796-6tn57\" (UID: \"973ba3b3-d07b-40ef-8419-40e19838e816\") " pod="openshift-ingress/router-default-5444994796-6tn57" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.290906 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rqprh\" (UniqueName: \"kubernetes.io/projected/15dfbf5e-77ac-477c-89f7-b1e035a219c0-kube-api-access-rqprh\") pod \"machine-config-server-l7c64\" (UID: \"15dfbf5e-77ac-477c-89f7-b1e035a219c0\") " pod="openshift-machine-config-operator/machine-config-server-l7c64" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.290925 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/f85acbe7-b253-4d4a-847f-05845804f712-metrics-tls\") pod \"ingress-operator-5b745b69d9-rzrx9\" (UID: 
\"f85acbe7-b253-4d4a-847f-05845804f712\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-rzrx9" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.290941 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/900e9596-8294-4c4d-857a-1b2bf9adaca7-client-ca\") pod \"route-controller-manager-6576b87f9c-znlgk\" (UID: \"900e9596-8294-4c4d-857a-1b2bf9adaca7\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-znlgk" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.290957 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/cbdc9822-68a6-4bff-b373-cac82f25f4d3-console-config\") pod \"console-f9d7485db-kwnx5\" (UID: \"cbdc9822-68a6-4bff-b373-cac82f25f4d3\") " pod="openshift-console/console-f9d7485db-kwnx5" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.290972 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wg6tp\" (UniqueName: \"kubernetes.io/projected/b0e1ef8e-6fc5-491d-b658-b812cc556f67-kube-api-access-wg6tp\") pod \"openshift-controller-manager-operator-756b6f6bc6-dhdzs\" (UID: \"b0e1ef8e-6fc5-491d-b658-b812cc556f67\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-dhdzs" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.290989 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/a60c525e-c2a9-4977-ae16-e2be015eab30-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-pldh4\" (UID: \"a60c525e-c2a9-4977-ae16-e2be015eab30\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-pldh4" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.291008 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/44b352f9-ee16-45a4-9674-e954eeed9c6c-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-4s4w9\" (UID: \"44b352f9-ee16-45a4-9674-e954eeed9c6c\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-4s4w9" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.291023 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zqkxm\" (UniqueName: \"kubernetes.io/projected/44b352f9-ee16-45a4-9674-e954eeed9c6c-kube-api-access-zqkxm\") pod \"cluster-image-registry-operator-dc59b4c8b-4s4w9\" (UID: \"44b352f9-ee16-45a4-9674-e954eeed9c6c\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-4s4w9" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.291037 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/24797c3c-a0a6-4bab-b61e-dcc1aaedcccb-socket-dir\") pod \"csi-hostpathplugin-2qght\" (UID: \"24797c3c-a0a6-4bab-b61e-dcc1aaedcccb\") " pod="hostpath-provisioner/csi-hostpathplugin-2qght" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.291080 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/6ff1b626-d50c-4608-be78-c27b787cc369-proxy-tls\") pod \"machine-config-operator-74547568cd-4rqzq\" (UID: \"6ff1b626-d50c-4608-be78-c27b787cc369\") " 
pod="openshift-machine-config-operator/machine-config-operator-74547568cd-4rqzq" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.291168 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dc558\" (UniqueName: \"kubernetes.io/projected/1c43e47b-7ccb-41ec-8f8f-08b159bb15f3-kube-api-access-dc558\") pod \"console-operator-58897d9998-pzs29\" (UID: \"1c43e47b-7ccb-41ec-8f8f-08b159bb15f3\") " pod="openshift-console-operator/console-operator-58897d9998-pzs29" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.291191 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rchp7\" (UniqueName: \"kubernetes.io/projected/973ba3b3-d07b-40ef-8419-40e19838e816-kube-api-access-rchp7\") pod \"router-default-5444994796-6tn57\" (UID: \"973ba3b3-d07b-40ef-8419-40e19838e816\") " pod="openshift-ingress/router-default-5444994796-6tn57" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.291269 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-94hsc\" (UniqueName: \"kubernetes.io/projected/3bf3cce7-2248-44a8-a39d-54860c49fb5f-kube-api-access-94hsc\") pod \"authentication-operator-69f744f599-xzv2b\" (UID: \"3bf3cce7-2248-44a8-a39d-54860c49fb5f\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-xzv2b" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.291320 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/dc2cf336-9480-4170-b13a-1f9b0d3cbcba-metrics-tls\") pod \"dns-default-p47s7\" (UID: \"dc2cf336-9480-4170-b13a-1f9b0d3cbcba\") " pod="openshift-dns/dns-default-p47s7" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.291351 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/8d51dda0-3be6-42ef-a2ff-76d5c2da9b9b-etcd-service-ca\") pod \"etcd-operator-b45778765-xqmqh\" (UID: \"8d51dda0-3be6-42ef-a2ff-76d5c2da9b9b\") " pod="openshift-etcd-operator/etcd-operator-b45778765-xqmqh" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.291430 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6a678474-b488-4785-a87c-70df116b33c9-config\") pod \"kube-controller-manager-operator-78b949d7b-ts64w\" (UID: \"6a678474-b488-4785-a87c-70df116b33c9\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-ts64w" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.291455 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9cd2da9c-112f-4043-af6f-a661a475cc2d-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-cq28p\" (UID: \"9cd2da9c-112f-4043-af6f-a661a475cc2d\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-cq28p" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.291483 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/cbdc9822-68a6-4bff-b373-cac82f25f4d3-oauth-serving-cert\") pod \"console-f9d7485db-kwnx5\" (UID: \"cbdc9822-68a6-4bff-b373-cac82f25f4d3\") " pod="openshift-console/console-f9d7485db-kwnx5" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.291507 4970 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fhx5j\" (UniqueName: \"kubernetes.io/projected/24797c3c-a0a6-4bab-b61e-dcc1aaedcccb-kube-api-access-fhx5j\") pod \"csi-hostpathplugin-2qght\" (UID: \"24797c3c-a0a6-4bab-b61e-dcc1aaedcccb\") " pod="hostpath-provisioner/csi-hostpathplugin-2qght" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.291800 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/1c43e47b-7ccb-41ec-8f8f-08b159bb15f3-trusted-ca\") pod \"console-operator-58897d9998-pzs29\" (UID: \"1c43e47b-7ccb-41ec-8f8f-08b159bb15f3\") " pod="openshift-console-operator/console-operator-58897d9998-pzs29" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.291661 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3bf3cce7-2248-44a8-a39d-54860c49fb5f-service-ca-bundle\") pod \"authentication-operator-69f744f599-xzv2b\" (UID: \"3bf3cce7-2248-44a8-a39d-54860c49fb5f\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-xzv2b" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.292302 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/34a23190-6bba-4504-ada1-0724c8a4f1df-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-t7pk6\" (UID: \"34a23190-6bba-4504-ada1-0724c8a4f1df\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-t7pk6" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.292365 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8d51dda0-3be6-42ef-a2ff-76d5c2da9b9b-config\") pod \"etcd-operator-b45778765-xqmqh\" (UID: \"8d51dda0-3be6-42ef-a2ff-76d5c2da9b9b\") " pod="openshift-etcd-operator/etcd-operator-b45778765-xqmqh" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.292390 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zvt9l\" (UniqueName: \"kubernetes.io/projected/849a6379-8100-4799-aa30-06c1359673b7-kube-api-access-zvt9l\") pod \"packageserver-d55dfcdfc-hv5px\" (UID: \"849a6379-8100-4799-aa30-06c1359673b7\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-hv5px" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.292304 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1c43e47b-7ccb-41ec-8f8f-08b159bb15f3-config\") pod \"console-operator-58897d9998-pzs29\" (UID: \"1c43e47b-7ccb-41ec-8f8f-08b159bb15f3\") " pod="openshift-console-operator/console-operator-58897d9998-pzs29" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.292479 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-szm5d\" (UniqueName: \"kubernetes.io/projected/a60c525e-c2a9-4977-ae16-e2be015eab30-kube-api-access-szm5d\") pod \"multus-admission-controller-857f4d67dd-pldh4\" (UID: \"a60c525e-c2a9-4977-ae16-e2be015eab30\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-pldh4" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.292521 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v6nzc\" 
(UniqueName: \"kubernetes.io/projected/5d8594d0-33ac-470e-b2d5-65d3afaa625b-kube-api-access-v6nzc\") pod \"olm-operator-6b444d44fb-gdzxc\" (UID: \"5d8594d0-33ac-470e-b2d5-65d3afaa625b\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-gdzxc" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.292550 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e77b7513-27ee-47a7-b39a-a11dd78a0500-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-nwp64\" (UID: \"e77b7513-27ee-47a7-b39a-a11dd78a0500\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-nwp64" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.292574 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fz5k5\" (UniqueName: \"kubernetes.io/projected/0c874537-15c1-4f94-b6cd-086c9c1762f0-kube-api-access-fz5k5\") pod \"migrator-59844c95c7-xbnhw\" (UID: \"0c874537-15c1-4f94-b6cd-086c9c1762f0\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-xbnhw" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.292735 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/514859c2-bd3c-4ccb-90b0-61180a1bc297-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-nccsb\" (UID: \"514859c2-bd3c-4ccb-90b0-61180a1bc297\") " pod="openshift-marketplace/marketplace-operator-79b997595-nccsb" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.292792 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ready\" (UniqueName: \"kubernetes.io/empty-dir/a5b9dda0-da70-4e7c-850b-de8b7744a15c-ready\") pod \"cni-sysctl-allowlist-ds-8btsr\" (UID: \"a5b9dda0-da70-4e7c-850b-de8b7744a15c\") " pod="openshift-multus/cni-sysctl-allowlist-ds-8btsr" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.292821 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/44b352f9-ee16-45a4-9674-e954eeed9c6c-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-4s4w9\" (UID: \"44b352f9-ee16-45a4-9674-e954eeed9c6c\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-4s4w9" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.292876 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wpv9z\" (UniqueName: \"kubernetes.io/projected/900e9596-8294-4c4d-857a-1b2bf9adaca7-kube-api-access-wpv9z\") pod \"route-controller-manager-6576b87f9c-znlgk\" (UID: \"900e9596-8294-4c4d-857a-1b2bf9adaca7\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-znlgk" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.292908 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/78210a9d-d2ee-4d21-a0e5-956cb8fd85d2-secret-volume\") pod \"collect-profiles-29405595-bgxjt\" (UID: \"78210a9d-d2ee-4d21-a0e5-956cb8fd85d2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29405595-bgxjt" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.292933 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: 
\"kubernetes.io/secret/5d8594d0-33ac-470e-b2d5-65d3afaa625b-srv-cert\") pod \"olm-operator-6b444d44fb-gdzxc\" (UID: \"5d8594d0-33ac-470e-b2d5-65d3afaa625b\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-gdzxc" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.292966 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/cbdc9822-68a6-4bff-b373-cac82f25f4d3-console-oauth-config\") pod \"console-f9d7485db-kwnx5\" (UID: \"cbdc9822-68a6-4bff-b373-cac82f25f4d3\") " pod="openshift-console/console-f9d7485db-kwnx5" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.292994 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/f85acbe7-b253-4d4a-847f-05845804f712-bound-sa-token\") pod \"ingress-operator-5b745b69d9-rzrx9\" (UID: \"f85acbe7-b253-4d4a-847f-05845804f712\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-rzrx9" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.293019 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/900e9596-8294-4c4d-857a-1b2bf9adaca7-config\") pod \"route-controller-manager-6576b87f9c-znlgk\" (UID: \"900e9596-8294-4c4d-857a-1b2bf9adaca7\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-znlgk" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.293045 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/6ff1b626-d50c-4608-be78-c27b787cc369-auth-proxy-config\") pod \"machine-config-operator-74547568cd-4rqzq\" (UID: \"6ff1b626-d50c-4608-be78-c27b787cc369\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-4rqzq" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.293077 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rr8gn\" (UniqueName: \"kubernetes.io/projected/19ff5c79-1e07-4c43-8d35-bdf19869c72b-kube-api-access-rr8gn\") pod \"cluster-samples-operator-665b6dd947-mdr5z\" (UID: \"19ff5c79-1e07-4c43-8d35-bdf19869c72b\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-mdr5z" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.293112 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qpp5l\" (UniqueName: \"kubernetes.io/projected/5a56c71c-0c49-4a0a-aee0-2a1ef5936574-kube-api-access-qpp5l\") pod \"openshift-config-operator-7777fb866f-7nxhr\" (UID: \"5a56c71c-0c49-4a0a-aee0-2a1ef5936574\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-7nxhr" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.293135 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cbdc9822-68a6-4bff-b373-cac82f25f4d3-trusted-ca-bundle\") pod \"console-f9d7485db-kwnx5\" (UID: \"cbdc9822-68a6-4bff-b373-cac82f25f4d3\") " pod="openshift-console/console-f9d7485db-kwnx5" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.293165 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6ff1b626-d50c-4608-be78-c27b787cc369-images\") pod \"machine-config-operator-74547568cd-4rqzq\" (UID: 
\"6ff1b626-d50c-4608-be78-c27b787cc369\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-4rqzq" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.293192 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5v7cs\" (UniqueName: \"kubernetes.io/projected/cbdc9822-68a6-4bff-b373-cac82f25f4d3-kube-api-access-5v7cs\") pod \"console-f9d7485db-kwnx5\" (UID: \"cbdc9822-68a6-4bff-b373-cac82f25f4d3\") " pod="openshift-console/console-f9d7485db-kwnx5" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.293229 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/dc2cf336-9480-4170-b13a-1f9b0d3cbcba-config-volume\") pod \"dns-default-p47s7\" (UID: \"dc2cf336-9480-4170-b13a-1f9b0d3cbcba\") " pod="openshift-dns/dns-default-p47s7" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.293254 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/78210a9d-d2ee-4d21-a0e5-956cb8fd85d2-config-volume\") pod \"collect-profiles-29405595-bgxjt\" (UID: \"78210a9d-d2ee-4d21-a0e5-956cb8fd85d2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29405595-bgxjt" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.293283 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cp4gq\" (UniqueName: \"kubernetes.io/projected/78210a9d-d2ee-4d21-a0e5-956cb8fd85d2-kube-api-access-cp4gq\") pod \"collect-profiles-29405595-bgxjt\" (UID: \"78210a9d-d2ee-4d21-a0e5-956cb8fd85d2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29405595-bgxjt" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.293319 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/44b352f9-ee16-45a4-9674-e954eeed9c6c-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-4s4w9\" (UID: \"44b352f9-ee16-45a4-9674-e954eeed9c6c\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-4s4w9" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.293342 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9cd2da9c-112f-4043-af6f-a661a475cc2d-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-cq28p\" (UID: \"9cd2da9c-112f-4043-af6f-a661a475cc2d\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-cq28p" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.293364 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/24797c3c-a0a6-4bab-b61e-dcc1aaedcccb-mountpoint-dir\") pod \"csi-hostpathplugin-2qght\" (UID: \"24797c3c-a0a6-4bab-b61e-dcc1aaedcccb\") " pod="hostpath-provisioner/csi-hostpathplugin-2qght" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.293387 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jt7m4\" (UniqueName: \"kubernetes.io/projected/514859c2-bd3c-4ccb-90b0-61180a1bc297-kube-api-access-jt7m4\") pod \"marketplace-operator-79b997595-nccsb\" (UID: \"514859c2-bd3c-4ccb-90b0-61180a1bc297\") " 
pod="openshift-marketplace/marketplace-operator-79b997595-nccsb" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.293422 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/e2bc2c5d-22da-4436-b7bc-0924d7f275f2-auth-proxy-config\") pod \"machine-approver-56656f9798-m8rgw\" (UID: \"e2bc2c5d-22da-4436-b7bc-0924d7f275f2\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-m8rgw" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.293450 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/b9e4bbc0-c71d-4cb0-82ab-a3c67a9a4894-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-hvcsr\" (UID: \"b9e4bbc0-c71d-4cb0-82ab-a3c67a9a4894\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-hvcsr" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.293473 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/514859c2-bd3c-4ccb-90b0-61180a1bc297-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-nccsb\" (UID: \"514859c2-bd3c-4ccb-90b0-61180a1bc297\") " pod="openshift-marketplace/marketplace-operator-79b997595-nccsb" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.293500 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1c43e47b-7ccb-41ec-8f8f-08b159bb15f3-serving-cert\") pod \"console-operator-58897d9998-pzs29\" (UID: \"1c43e47b-7ccb-41ec-8f8f-08b159bb15f3\") " pod="openshift-console-operator/console-operator-58897d9998-pzs29" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.293519 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/cbdc9822-68a6-4bff-b373-cac82f25f4d3-service-ca\") pod \"console-f9d7485db-kwnx5\" (UID: \"cbdc9822-68a6-4bff-b373-cac82f25f4d3\") " pod="openshift-console/console-f9d7485db-kwnx5" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.293535 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/973ba3b3-d07b-40ef-8419-40e19838e816-service-ca-bundle\") pod \"router-default-5444994796-6tn57\" (UID: \"973ba3b3-d07b-40ef-8419-40e19838e816\") " pod="openshift-ingress/router-default-5444994796-6tn57" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.293552 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f85acbe7-b253-4d4a-847f-05845804f712-trusted-ca\") pod \"ingress-operator-5b745b69d9-rzrx9\" (UID: \"f85acbe7-b253-4d4a-847f-05845804f712\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-rzrx9" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.293570 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b0e1ef8e-6fc5-491d-b658-b812cc556f67-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-dhdzs\" (UID: \"b0e1ef8e-6fc5-491d-b658-b812cc556f67\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-dhdzs" Nov 28 13:20:26 crc 
kubenswrapper[4970]: I1128 13:20:26.293590 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/15dfbf5e-77ac-477c-89f7-b1e035a219c0-node-bootstrap-token\") pod \"machine-config-server-l7c64\" (UID: \"15dfbf5e-77ac-477c-89f7-b1e035a219c0\") " pod="openshift-machine-config-operator/machine-config-server-l7c64" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.293619 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/5a56c71c-0c49-4a0a-aee0-2a1ef5936574-available-featuregates\") pod \"openshift-config-operator-7777fb866f-7nxhr\" (UID: \"5a56c71c-0c49-4a0a-aee0-2a1ef5936574\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-7nxhr" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.293645 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/849a6379-8100-4799-aa30-06c1359673b7-apiservice-cert\") pod \"packageserver-d55dfcdfc-hv5px\" (UID: \"849a6379-8100-4799-aa30-06c1359673b7\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-hv5px" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.293667 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rvgfk\" (UniqueName: \"kubernetes.io/projected/6ff1b626-d50c-4608-be78-c27b787cc369-kube-api-access-rvgfk\") pod \"machine-config-operator-74547568cd-4rqzq\" (UID: \"6ff1b626-d50c-4608-be78-c27b787cc369\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-4rqzq" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.293689 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/8d51dda0-3be6-42ef-a2ff-76d5c2da9b9b-etcd-client\") pod \"etcd-operator-b45778765-xqmqh\" (UID: \"8d51dda0-3be6-42ef-a2ff-76d5c2da9b9b\") " pod="openshift-etcd-operator/etcd-operator-b45778765-xqmqh" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.293712 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3bf3cce7-2248-44a8-a39d-54860c49fb5f-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-xzv2b\" (UID: \"3bf3cce7-2248-44a8-a39d-54860c49fb5f\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-xzv2b" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.293735 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pqv54\" (UniqueName: \"kubernetes.io/projected/aa0691f5-c326-4370-a718-bc82bcfbdd78-kube-api-access-pqv54\") pod \"dns-operator-744455d44c-q9h56\" (UID: \"aa0691f5-c326-4370-a718-bc82bcfbdd78\") " pod="openshift-dns-operator/dns-operator-744455d44c-q9h56" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.293762 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b0e1ef8e-6fc5-491d-b658-b812cc556f67-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-dhdzs\" (UID: \"b0e1ef8e-6fc5-491d-b658-b812cc556f67\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-dhdzs" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 
13:20:26.293786 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3bf3cce7-2248-44a8-a39d-54860c49fb5f-config\") pod \"authentication-operator-69f744f599-xzv2b\" (UID: \"3bf3cce7-2248-44a8-a39d-54860c49fb5f\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-xzv2b" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.293810 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/849a6379-8100-4799-aa30-06c1359673b7-tmpfs\") pod \"packageserver-d55dfcdfc-hv5px\" (UID: \"849a6379-8100-4799-aa30-06c1359673b7\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-hv5px" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.293835 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/cbdc9822-68a6-4bff-b373-cac82f25f4d3-console-serving-cert\") pod \"console-f9d7485db-kwnx5\" (UID: \"cbdc9822-68a6-4bff-b373-cac82f25f4d3\") " pod="openshift-console/console-f9d7485db-kwnx5" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.293857 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/973ba3b3-d07b-40ef-8419-40e19838e816-default-certificate\") pod \"router-default-5444994796-6tn57\" (UID: \"973ba3b3-d07b-40ef-8419-40e19838e816\") " pod="openshift-ingress/router-default-5444994796-6tn57" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.293880 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/24797c3c-a0a6-4bab-b61e-dcc1aaedcccb-plugins-dir\") pod \"csi-hostpathplugin-2qght\" (UID: \"24797c3c-a0a6-4bab-b61e-dcc1aaedcccb\") " pod="hostpath-provisioner/csi-hostpathplugin-2qght" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.293903 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-65zmp\" (UniqueName: \"kubernetes.io/projected/3e2746d8-5b75-43dc-80f4-81f69c38bb35-kube-api-access-65zmp\") pod \"service-ca-9c57cc56f-kfmkb\" (UID: \"3e2746d8-5b75-43dc-80f4-81f69c38bb35\") " pod="openshift-service-ca/service-ca-9c57cc56f-kfmkb" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.293926 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e2bc2c5d-22da-4436-b7bc-0924d7f275f2-config\") pod \"machine-approver-56656f9798-m8rgw\" (UID: \"e2bc2c5d-22da-4436-b7bc-0924d7f275f2\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-m8rgw" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.293951 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/a5b9dda0-da70-4e7c-850b-de8b7744a15c-tuning-conf-dir\") pod \"cni-sysctl-allowlist-ds-8btsr\" (UID: \"a5b9dda0-da70-4e7c-850b-de8b7744a15c\") " pod="openshift-multus/cni-sysctl-allowlist-ds-8btsr" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.293983 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/24797c3c-a0a6-4bab-b61e-dcc1aaedcccb-csi-data-dir\") pod 
\"csi-hostpathplugin-2qght\" (UID: \"24797c3c-a0a6-4bab-b61e-dcc1aaedcccb\") " pod="hostpath-provisioner/csi-hostpathplugin-2qght" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.294006 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3bf3cce7-2248-44a8-a39d-54860c49fb5f-serving-cert\") pod \"authentication-operator-69f744f599-xzv2b\" (UID: \"3bf3cce7-2248-44a8-a39d-54860c49fb5f\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-xzv2b" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.294039 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/aa0691f5-c326-4370-a718-bc82bcfbdd78-metrics-tls\") pod \"dns-operator-744455d44c-q9h56\" (UID: \"aa0691f5-c326-4370-a718-bc82bcfbdd78\") " pod="openshift-dns-operator/dns-operator-744455d44c-q9h56" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.294062 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qsz8x\" (UniqueName: \"kubernetes.io/projected/dc2cf336-9480-4170-b13a-1f9b0d3cbcba-kube-api-access-qsz8x\") pod \"dns-default-p47s7\" (UID: \"dc2cf336-9480-4170-b13a-1f9b0d3cbcba\") " pod="openshift-dns/dns-default-p47s7" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.294188 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/19ff5c79-1e07-4c43-8d35-bdf19869c72b-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-mdr5z\" (UID: \"19ff5c79-1e07-4c43-8d35-bdf19869c72b\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-mdr5z" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.294238 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2347a213-dd3e-4f1c-b36b-a8345bedb927-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-86qzl\" (UID: \"2347a213-dd3e-4f1c-b36b-a8345bedb927\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-86qzl" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.294358 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/973ba3b3-d07b-40ef-8419-40e19838e816-service-ca-bundle\") pod \"router-default-5444994796-6tn57\" (UID: \"973ba3b3-d07b-40ef-8419-40e19838e816\") " pod="openshift-ingress/router-default-5444994796-6tn57" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.294381 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9cd2da9c-112f-4043-af6f-a661a475cc2d-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-cq28p\" (UID: \"9cd2da9c-112f-4043-af6f-a661a475cc2d\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-cq28p" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.294404 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/cbdc9822-68a6-4bff-b373-cac82f25f4d3-console-config\") pod \"console-f9d7485db-kwnx5\" (UID: \"cbdc9822-68a6-4bff-b373-cac82f25f4d3\") " pod="openshift-console/console-f9d7485db-kwnx5" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.294879 4970 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cbdc9822-68a6-4bff-b373-cac82f25f4d3-trusted-ca-bundle\") pod \"console-f9d7485db-kwnx5\" (UID: \"cbdc9822-68a6-4bff-b373-cac82f25f4d3\") " pod="openshift-console/console-f9d7485db-kwnx5" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.294990 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3bf3cce7-2248-44a8-a39d-54860c49fb5f-config\") pod \"authentication-operator-69f744f599-xzv2b\" (UID: \"3bf3cce7-2248-44a8-a39d-54860c49fb5f\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-xzv2b" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.295275 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/cbdc9822-68a6-4bff-b373-cac82f25f4d3-service-ca\") pod \"console-f9d7485db-kwnx5\" (UID: \"cbdc9822-68a6-4bff-b373-cac82f25f4d3\") " pod="openshift-console/console-f9d7485db-kwnx5" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.295675 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f85acbe7-b253-4d4a-847f-05845804f712-trusted-ca\") pod \"ingress-operator-5b745b69d9-rzrx9\" (UID: \"f85acbe7-b253-4d4a-847f-05845804f712\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-rzrx9" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.295794 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b0e1ef8e-6fc5-491d-b658-b812cc556f67-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-dhdzs\" (UID: \"b0e1ef8e-6fc5-491d-b658-b812cc556f67\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-dhdzs" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.295934 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3bf3cce7-2248-44a8-a39d-54860c49fb5f-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-xzv2b\" (UID: \"3bf3cce7-2248-44a8-a39d-54860c49fb5f\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-xzv2b" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.296187 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/973ba3b3-d07b-40ef-8419-40e19838e816-stats-auth\") pod \"router-default-5444994796-6tn57\" (UID: \"973ba3b3-d07b-40ef-8419-40e19838e816\") " pod="openshift-ingress/router-default-5444994796-6tn57" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.296529 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5a56c71c-0c49-4a0a-aee0-2a1ef5936574-serving-cert\") pod \"openshift-config-operator-7777fb866f-7nxhr\" (UID: \"5a56c71c-0c49-4a0a-aee0-2a1ef5936574\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-7nxhr" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.296601 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/973ba3b3-d07b-40ef-8419-40e19838e816-metrics-certs\") pod \"router-default-5444994796-6tn57\" (UID: \"973ba3b3-d07b-40ef-8419-40e19838e816\") " 
pod="openshift-ingress/router-default-5444994796-6tn57" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.296701 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/5a56c71c-0c49-4a0a-aee0-2a1ef5936574-available-featuregates\") pod \"openshift-config-operator-7777fb866f-7nxhr\" (UID: \"5a56c71c-0c49-4a0a-aee0-2a1ef5936574\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-7nxhr" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.297000 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/44b352f9-ee16-45a4-9674-e954eeed9c6c-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-4s4w9\" (UID: \"44b352f9-ee16-45a4-9674-e954eeed9c6c\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-4s4w9" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.297592 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9cd2da9c-112f-4043-af6f-a661a475cc2d-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-cq28p\" (UID: \"9cd2da9c-112f-4043-af6f-a661a475cc2d\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-cq28p" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.297869 4970 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.298462 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/cbdc9822-68a6-4bff-b373-cac82f25f4d3-console-oauth-config\") pod \"console-f9d7485db-kwnx5\" (UID: \"cbdc9822-68a6-4bff-b373-cac82f25f4d3\") " pod="openshift-console/console-f9d7485db-kwnx5" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.298494 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1c43e47b-7ccb-41ec-8f8f-08b159bb15f3-serving-cert\") pod \"console-operator-58897d9998-pzs29\" (UID: \"1c43e47b-7ccb-41ec-8f8f-08b159bb15f3\") " pod="openshift-console-operator/console-operator-58897d9998-pzs29" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.298540 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/cbdc9822-68a6-4bff-b373-cac82f25f4d3-console-serving-cert\") pod \"console-f9d7485db-kwnx5\" (UID: \"cbdc9822-68a6-4bff-b373-cac82f25f4d3\") " pod="openshift-console/console-f9d7485db-kwnx5" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.298940 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/f85acbe7-b253-4d4a-847f-05845804f712-metrics-tls\") pod \"ingress-operator-5b745b69d9-rzrx9\" (UID: \"f85acbe7-b253-4d4a-847f-05845804f712\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-rzrx9" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.299079 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2347a213-dd3e-4f1c-b36b-a8345bedb927-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-86qzl\" (UID: \"2347a213-dd3e-4f1c-b36b-a8345bedb927\") " 
pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-86qzl" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.299886 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b0e1ef8e-6fc5-491d-b658-b812cc556f67-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-dhdzs\" (UID: \"b0e1ef8e-6fc5-491d-b658-b812cc556f67\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-dhdzs" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.299911 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/aa0691f5-c326-4370-a718-bc82bcfbdd78-metrics-tls\") pod \"dns-operator-744455d44c-q9h56\" (UID: \"aa0691f5-c326-4370-a718-bc82bcfbdd78\") " pod="openshift-dns-operator/dns-operator-744455d44c-q9h56" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.300340 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3bf3cce7-2248-44a8-a39d-54860c49fb5f-serving-cert\") pod \"authentication-operator-69f744f599-xzv2b\" (UID: \"3bf3cce7-2248-44a8-a39d-54860c49fb5f\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-xzv2b" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.300371 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/973ba3b3-d07b-40ef-8419-40e19838e816-default-certificate\") pod \"router-default-5444994796-6tn57\" (UID: \"973ba3b3-d07b-40ef-8419-40e19838e816\") " pod="openshift-ingress/router-default-5444994796-6tn57" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.318174 4970 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.323412 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/8d51dda0-3be6-42ef-a2ff-76d5c2da9b9b-etcd-service-ca\") pod \"etcd-operator-b45778765-xqmqh\" (UID: \"8d51dda0-3be6-42ef-a2ff-76d5c2da9b9b\") " pod="openshift-etcd-operator/etcd-operator-b45778765-xqmqh" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.337881 4970 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.359097 4970 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.379032 4970 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.380865 4970 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.380873 4970 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.380876 4970 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.384347 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8d51dda0-3be6-42ef-a2ff-76d5c2da9b9b-serving-cert\") pod \"etcd-operator-b45778765-xqmqh\" (UID: \"8d51dda0-3be6-42ef-a2ff-76d5c2da9b9b\") " pod="openshift-etcd-operator/etcd-operator-b45778765-xqmqh" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.395622 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jt7m4\" (UniqueName: \"kubernetes.io/projected/514859c2-bd3c-4ccb-90b0-61180a1bc297-kube-api-access-jt7m4\") pod \"marketplace-operator-79b997595-nccsb\" (UID: \"514859c2-bd3c-4ccb-90b0-61180a1bc297\") " pod="openshift-marketplace/marketplace-operator-79b997595-nccsb" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.395669 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/24797c3c-a0a6-4bab-b61e-dcc1aaedcccb-mountpoint-dir\") pod \"csi-hostpathplugin-2qght\" (UID: \"24797c3c-a0a6-4bab-b61e-dcc1aaedcccb\") " pod="hostpath-provisioner/csi-hostpathplugin-2qght" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.395723 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/514859c2-bd3c-4ccb-90b0-61180a1bc297-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-nccsb\" (UID: \"514859c2-bd3c-4ccb-90b0-61180a1bc297\") " pod="openshift-marketplace/marketplace-operator-79b997595-nccsb" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.395746 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/e2bc2c5d-22da-4436-b7bc-0924d7f275f2-auth-proxy-config\") pod \"machine-approver-56656f9798-m8rgw\" (UID: \"e2bc2c5d-22da-4436-b7bc-0924d7f275f2\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-m8rgw" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.395784 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/15dfbf5e-77ac-477c-89f7-b1e035a219c0-node-bootstrap-token\") pod \"machine-config-server-l7c64\" (UID: \"15dfbf5e-77ac-477c-89f7-b1e035a219c0\") " pod="openshift-machine-config-operator/machine-config-server-l7c64" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.395807 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rvgfk\" (UniqueName: \"kubernetes.io/projected/6ff1b626-d50c-4608-be78-c27b787cc369-kube-api-access-rvgfk\") pod \"machine-config-operator-74547568cd-4rqzq\" (UID: \"6ff1b626-d50c-4608-be78-c27b787cc369\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-4rqzq" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.395836 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/849a6379-8100-4799-aa30-06c1359673b7-apiservice-cert\") pod \"packageserver-d55dfcdfc-hv5px\" (UID: \"849a6379-8100-4799-aa30-06c1359673b7\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-hv5px" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.395876 4970 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/849a6379-8100-4799-aa30-06c1359673b7-tmpfs\") pod \"packageserver-d55dfcdfc-hv5px\" (UID: \"849a6379-8100-4799-aa30-06c1359673b7\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-hv5px" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.395900 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/24797c3c-a0a6-4bab-b61e-dcc1aaedcccb-plugins-dir\") pod \"csi-hostpathplugin-2qght\" (UID: \"24797c3c-a0a6-4bab-b61e-dcc1aaedcccb\") " pod="hostpath-provisioner/csi-hostpathplugin-2qght" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.395922 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-65zmp\" (UniqueName: \"kubernetes.io/projected/3e2746d8-5b75-43dc-80f4-81f69c38bb35-kube-api-access-65zmp\") pod \"service-ca-9c57cc56f-kfmkb\" (UID: \"3e2746d8-5b75-43dc-80f4-81f69c38bb35\") " pod="openshift-service-ca/service-ca-9c57cc56f-kfmkb" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.395943 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e2bc2c5d-22da-4436-b7bc-0924d7f275f2-config\") pod \"machine-approver-56656f9798-m8rgw\" (UID: \"e2bc2c5d-22da-4436-b7bc-0924d7f275f2\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-m8rgw" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.395972 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/a5b9dda0-da70-4e7c-850b-de8b7744a15c-tuning-conf-dir\") pod \"cni-sysctl-allowlist-ds-8btsr\" (UID: \"a5b9dda0-da70-4e7c-850b-de8b7744a15c\") " pod="openshift-multus/cni-sysctl-allowlist-ds-8btsr" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.396006 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/24797c3c-a0a6-4bab-b61e-dcc1aaedcccb-csi-data-dir\") pod \"csi-hostpathplugin-2qght\" (UID: \"24797c3c-a0a6-4bab-b61e-dcc1aaedcccb\") " pod="hostpath-provisioner/csi-hostpathplugin-2qght" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.396031 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qsz8x\" (UniqueName: \"kubernetes.io/projected/dc2cf336-9480-4170-b13a-1f9b0d3cbcba-kube-api-access-qsz8x\") pod \"dns-default-p47s7\" (UID: \"dc2cf336-9480-4170-b13a-1f9b0d3cbcba\") " pod="openshift-dns/dns-default-p47s7" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.396073 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q6s4p\" (UniqueName: \"kubernetes.io/projected/e2bc2c5d-22da-4436-b7bc-0924d7f275f2-kube-api-access-q6s4p\") pod \"machine-approver-56656f9798-m8rgw\" (UID: \"e2bc2c5d-22da-4436-b7bc-0924d7f275f2\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-m8rgw" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.396070 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/a5b9dda0-da70-4e7c-850b-de8b7744a15c-tuning-conf-dir\") pod \"cni-sysctl-allowlist-ds-8btsr\" (UID: \"a5b9dda0-da70-4e7c-850b-de8b7744a15c\") " pod="openshift-multus/cni-sysctl-allowlist-ds-8btsr" Nov 28 13:20:26 crc kubenswrapper[4970]: 
I1128 13:20:26.396101 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a4f08974-1f20-4fe4-a63c-3c69b3064e4b-cert\") pod \"ingress-canary-kdkxk\" (UID: \"a4f08974-1f20-4fe4-a63c-3c69b3064e4b\") " pod="openshift-ingress-canary/ingress-canary-kdkxk" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.396107 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/24797c3c-a0a6-4bab-b61e-dcc1aaedcccb-csi-data-dir\") pod \"csi-hostpathplugin-2qght\" (UID: \"24797c3c-a0a6-4bab-b61e-dcc1aaedcccb\") " pod="hostpath-provisioner/csi-hostpathplugin-2qght" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.396123 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/3e2746d8-5b75-43dc-80f4-81f69c38bb35-signing-key\") pod \"service-ca-9c57cc56f-kfmkb\" (UID: \"3e2746d8-5b75-43dc-80f4-81f69c38bb35\") " pod="openshift-service-ca/service-ca-9c57cc56f-kfmkb" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.396147 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gx24w\" (UniqueName: \"kubernetes.io/projected/a5b9dda0-da70-4e7c-850b-de8b7744a15c-kube-api-access-gx24w\") pod \"cni-sysctl-allowlist-ds-8btsr\" (UID: \"a5b9dda0-da70-4e7c-850b-de8b7744a15c\") " pod="openshift-multus/cni-sysctl-allowlist-ds-8btsr" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.396197 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/849a6379-8100-4799-aa30-06c1359673b7-webhook-cert\") pod \"packageserver-d55dfcdfc-hv5px\" (UID: \"849a6379-8100-4799-aa30-06c1359673b7\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-hv5px" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.396252 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lskfl\" (UniqueName: \"kubernetes.io/projected/34a23190-6bba-4504-ada1-0724c8a4f1df-kube-api-access-lskfl\") pod \"package-server-manager-789f6589d5-t7pk6\" (UID: \"34a23190-6bba-4504-ada1-0724c8a4f1df\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-t7pk6" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.396271 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/24797c3c-a0a6-4bab-b61e-dcc1aaedcccb-plugins-dir\") pod \"csi-hostpathplugin-2qght\" (UID: \"24797c3c-a0a6-4bab-b61e-dcc1aaedcccb\") " pod="hostpath-provisioner/csi-hostpathplugin-2qght" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.396382 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/24797c3c-a0a6-4bab-b61e-dcc1aaedcccb-registration-dir\") pod \"csi-hostpathplugin-2qght\" (UID: \"24797c3c-a0a6-4bab-b61e-dcc1aaedcccb\") " pod="hostpath-provisioner/csi-hostpathplugin-2qght" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.396450 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b47zt\" (UniqueName: \"kubernetes.io/projected/a4f08974-1f20-4fe4-a63c-3c69b3064e4b-kube-api-access-b47zt\") pod \"ingress-canary-kdkxk\" (UID: \"a4f08974-1f20-4fe4-a63c-3c69b3064e4b\") " pod="openshift-ingress-canary/ingress-canary-kdkxk" Nov 28 
13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.396487 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/15dfbf5e-77ac-477c-89f7-b1e035a219c0-certs\") pod \"machine-config-server-l7c64\" (UID: \"15dfbf5e-77ac-477c-89f7-b1e035a219c0\") " pod="openshift-machine-config-operator/machine-config-server-l7c64" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.396502 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/24797c3c-a0a6-4bab-b61e-dcc1aaedcccb-registration-dir\") pod \"csi-hostpathplugin-2qght\" (UID: \"24797c3c-a0a6-4bab-b61e-dcc1aaedcccb\") " pod="hostpath-provisioner/csi-hostpathplugin-2qght" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.396502 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/849a6379-8100-4799-aa30-06c1359673b7-tmpfs\") pod \"packageserver-d55dfcdfc-hv5px\" (UID: \"849a6379-8100-4799-aa30-06c1359673b7\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-hv5px" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.396542 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/3e2746d8-5b75-43dc-80f4-81f69c38bb35-signing-cabundle\") pod \"service-ca-9c57cc56f-kfmkb\" (UID: \"3e2746d8-5b75-43dc-80f4-81f69c38bb35\") " pod="openshift-service-ca/service-ca-9c57cc56f-kfmkb" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.396577 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/5d8594d0-33ac-470e-b2d5-65d3afaa625b-profile-collector-cert\") pod \"olm-operator-6b444d44fb-gdzxc\" (UID: \"5d8594d0-33ac-470e-b2d5-65d3afaa625b\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-gdzxc" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.396616 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/a5b9dda0-da70-4e7c-850b-de8b7744a15c-cni-sysctl-allowlist\") pod \"cni-sysctl-allowlist-ds-8btsr\" (UID: \"a5b9dda0-da70-4e7c-850b-de8b7744a15c\") " pod="openshift-multus/cni-sysctl-allowlist-ds-8btsr" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.396652 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/e2bc2c5d-22da-4436-b7bc-0924d7f275f2-machine-approver-tls\") pod \"machine-approver-56656f9798-m8rgw\" (UID: \"e2bc2c5d-22da-4436-b7bc-0924d7f275f2\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-m8rgw" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.396710 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rqprh\" (UniqueName: \"kubernetes.io/projected/15dfbf5e-77ac-477c-89f7-b1e035a219c0-kube-api-access-rqprh\") pod \"machine-config-server-l7c64\" (UID: \"15dfbf5e-77ac-477c-89f7-b1e035a219c0\") " pod="openshift-machine-config-operator/machine-config-server-l7c64" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.396721 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/24797c3c-a0a6-4bab-b61e-dcc1aaedcccb-mountpoint-dir\") pod \"csi-hostpathplugin-2qght\" (UID: 
\"24797c3c-a0a6-4bab-b61e-dcc1aaedcccb\") " pod="hostpath-provisioner/csi-hostpathplugin-2qght" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.396789 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/a60c525e-c2a9-4977-ae16-e2be015eab30-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-pldh4\" (UID: \"a60c525e-c2a9-4977-ae16-e2be015eab30\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-pldh4" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.396828 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/24797c3c-a0a6-4bab-b61e-dcc1aaedcccb-socket-dir\") pod \"csi-hostpathplugin-2qght\" (UID: \"24797c3c-a0a6-4bab-b61e-dcc1aaedcccb\") " pod="hostpath-provisioner/csi-hostpathplugin-2qght" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.396877 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/6ff1b626-d50c-4608-be78-c27b787cc369-proxy-tls\") pod \"machine-config-operator-74547568cd-4rqzq\" (UID: \"6ff1b626-d50c-4608-be78-c27b787cc369\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-4rqzq" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.396951 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/dc2cf336-9480-4170-b13a-1f9b0d3cbcba-metrics-tls\") pod \"dns-default-p47s7\" (UID: \"dc2cf336-9480-4170-b13a-1f9b0d3cbcba\") " pod="openshift-dns/dns-default-p47s7" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.396988 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fhx5j\" (UniqueName: \"kubernetes.io/projected/24797c3c-a0a6-4bab-b61e-dcc1aaedcccb-kube-api-access-fhx5j\") pod \"csi-hostpathplugin-2qght\" (UID: \"24797c3c-a0a6-4bab-b61e-dcc1aaedcccb\") " pod="hostpath-provisioner/csi-hostpathplugin-2qght" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.397035 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/34a23190-6bba-4504-ada1-0724c8a4f1df-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-t7pk6\" (UID: \"34a23190-6bba-4504-ada1-0724c8a4f1df\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-t7pk6" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.397112 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zvt9l\" (UniqueName: \"kubernetes.io/projected/849a6379-8100-4799-aa30-06c1359673b7-kube-api-access-zvt9l\") pod \"packageserver-d55dfcdfc-hv5px\" (UID: \"849a6379-8100-4799-aa30-06c1359673b7\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-hv5px" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.397162 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-szm5d\" (UniqueName: \"kubernetes.io/projected/a60c525e-c2a9-4977-ae16-e2be015eab30-kube-api-access-szm5d\") pod \"multus-admission-controller-857f4d67dd-pldh4\" (UID: \"a60c525e-c2a9-4977-ae16-e2be015eab30\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-pldh4" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.397241 4970 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/514859c2-bd3c-4ccb-90b0-61180a1bc297-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-nccsb\" (UID: \"514859c2-bd3c-4ccb-90b0-61180a1bc297\") " pod="openshift-marketplace/marketplace-operator-79b997595-nccsb" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.397300 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ready\" (UniqueName: \"kubernetes.io/empty-dir/a5b9dda0-da70-4e7c-850b-de8b7744a15c-ready\") pod \"cni-sysctl-allowlist-ds-8btsr\" (UID: \"a5b9dda0-da70-4e7c-850b-de8b7744a15c\") " pod="openshift-multus/cni-sysctl-allowlist-ds-8btsr" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.397314 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/24797c3c-a0a6-4bab-b61e-dcc1aaedcccb-socket-dir\") pod \"csi-hostpathplugin-2qght\" (UID: \"24797c3c-a0a6-4bab-b61e-dcc1aaedcccb\") " pod="hostpath-provisioner/csi-hostpathplugin-2qght" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.397334 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v6nzc\" (UniqueName: \"kubernetes.io/projected/5d8594d0-33ac-470e-b2d5-65d3afaa625b-kube-api-access-v6nzc\") pod \"olm-operator-6b444d44fb-gdzxc\" (UID: \"5d8594d0-33ac-470e-b2d5-65d3afaa625b\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-gdzxc" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.397392 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/78210a9d-d2ee-4d21-a0e5-956cb8fd85d2-secret-volume\") pod \"collect-profiles-29405595-bgxjt\" (UID: \"78210a9d-d2ee-4d21-a0e5-956cb8fd85d2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29405595-bgxjt" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.397444 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/5d8594d0-33ac-470e-b2d5-65d3afaa625b-srv-cert\") pod \"olm-operator-6b444d44fb-gdzxc\" (UID: \"5d8594d0-33ac-470e-b2d5-65d3afaa625b\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-gdzxc" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.397492 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/6ff1b626-d50c-4608-be78-c27b787cc369-auth-proxy-config\") pod \"machine-config-operator-74547568cd-4rqzq\" (UID: \"6ff1b626-d50c-4608-be78-c27b787cc369\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-4rqzq" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.397541 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6ff1b626-d50c-4608-be78-c27b787cc369-images\") pod \"machine-config-operator-74547568cd-4rqzq\" (UID: \"6ff1b626-d50c-4608-be78-c27b787cc369\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-4rqzq" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.397612 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/dc2cf336-9480-4170-b13a-1f9b0d3cbcba-config-volume\") pod \"dns-default-p47s7\" (UID: \"dc2cf336-9480-4170-b13a-1f9b0d3cbcba\") " 
pod="openshift-dns/dns-default-p47s7" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.397644 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/78210a9d-d2ee-4d21-a0e5-956cb8fd85d2-config-volume\") pod \"collect-profiles-29405595-bgxjt\" (UID: \"78210a9d-d2ee-4d21-a0e5-956cb8fd85d2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29405595-bgxjt" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.397679 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cp4gq\" (UniqueName: \"kubernetes.io/projected/78210a9d-d2ee-4d21-a0e5-956cb8fd85d2-kube-api-access-cp4gq\") pod \"collect-profiles-29405595-bgxjt\" (UID: \"78210a9d-d2ee-4d21-a0e5-956cb8fd85d2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29405595-bgxjt" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.398082 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/6ff1b626-d50c-4608-be78-c27b787cc369-auth-proxy-config\") pod \"machine-config-operator-74547568cd-4rqzq\" (UID: \"6ff1b626-d50c-4608-be78-c27b787cc369\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-4rqzq" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.398365 4970 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.398702 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ready\" (UniqueName: \"kubernetes.io/empty-dir/a5b9dda0-da70-4e7c-850b-de8b7744a15c-ready\") pod \"cni-sysctl-allowlist-ds-8btsr\" (UID: \"a5b9dda0-da70-4e7c-850b-de8b7744a15c\") " pod="openshift-multus/cni-sysctl-allowlist-ds-8btsr" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.409970 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/8d51dda0-3be6-42ef-a2ff-76d5c2da9b9b-etcd-client\") pod \"etcd-operator-b45778765-xqmqh\" (UID: \"8d51dda0-3be6-42ef-a2ff-76d5c2da9b9b\") " pod="openshift-etcd-operator/etcd-operator-b45778765-xqmqh" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.434509 4970 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.439342 4970 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.446353 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8d51dda0-3be6-42ef-a2ff-76d5c2da9b9b-config\") pod \"etcd-operator-b45778765-xqmqh\" (UID: \"8d51dda0-3be6-42ef-a2ff-76d5c2da9b9b\") " pod="openshift-etcd-operator/etcd-operator-b45778765-xqmqh" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.462942 4970 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.471351 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/8d51dda0-3be6-42ef-a2ff-76d5c2da9b9b-etcd-ca\") pod \"etcd-operator-b45778765-xqmqh\" (UID: \"8d51dda0-3be6-42ef-a2ff-76d5c2da9b9b\") " 
pod="openshift-etcd-operator/etcd-operator-b45778765-xqmqh" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.478418 4970 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.497627 4970 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.518058 4970 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.525236 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6a678474-b488-4785-a87c-70df116b33c9-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-ts64w\" (UID: \"6a678474-b488-4785-a87c-70df116b33c9\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-ts64w" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.539247 4970 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.542597 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6a678474-b488-4785-a87c-70df116b33c9-config\") pod \"kube-controller-manager-operator-78b949d7b-ts64w\" (UID: \"6a678474-b488-4785-a87c-70df116b33c9\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-ts64w" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.579697 4970 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.598747 4970 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.618833 4970 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.626417 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/900e9596-8294-4c4d-857a-1b2bf9adaca7-serving-cert\") pod \"route-controller-manager-6576b87f9c-znlgk\" (UID: \"900e9596-8294-4c4d-857a-1b2bf9adaca7\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-znlgk" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.639097 4970 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.645589 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/900e9596-8294-4c4d-857a-1b2bf9adaca7-config\") pod \"route-controller-manager-6576b87f9c-znlgk\" (UID: \"900e9596-8294-4c4d-857a-1b2bf9adaca7\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-znlgk" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.658860 4970 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-route-controller-manager"/"client-ca" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.662706 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/900e9596-8294-4c4d-857a-1b2bf9adaca7-client-ca\") pod \"route-controller-manager-6576b87f9c-znlgk\" (UID: \"900e9596-8294-4c4d-857a-1b2bf9adaca7\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-znlgk" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.679421 4970 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.698534 4970 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.708431 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/b9e4bbc0-c71d-4cb0-82ab-a3c67a9a4894-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-hvcsr\" (UID: \"b9e4bbc0-c71d-4cb0-82ab-a3c67a9a4894\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-hvcsr" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.718544 4970 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.739309 4970 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.758755 4970 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.778617 4970 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.798141 4970 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.819429 4970 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.838636 4970 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.844772 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e77b7513-27ee-47a7-b39a-a11dd78a0500-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-nwp64\" (UID: \"e77b7513-27ee-47a7-b39a-a11dd78a0500\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-nwp64" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.859946 4970 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.862132 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" 
(UniqueName: \"kubernetes.io/configmap/e77b7513-27ee-47a7-b39a-a11dd78a0500-config\") pod \"kube-apiserver-operator-766d6c64bb-nwp64\" (UID: \"e77b7513-27ee-47a7-b39a-a11dd78a0500\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-nwp64" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.879465 4970 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.888741 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6ff1b626-d50c-4608-be78-c27b787cc369-images\") pod \"machine-config-operator-74547568cd-4rqzq\" (UID: \"6ff1b626-d50c-4608-be78-c27b787cc369\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-4rqzq" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.899051 4970 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.918070 4970 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.932658 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/6ff1b626-d50c-4608-be78-c27b787cc369-proxy-tls\") pod \"machine-config-operator-74547568cd-4rqzq\" (UID: \"6ff1b626-d50c-4608-be78-c27b787cc369\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-4rqzq" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.938322 4970 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.958706 4970 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.977766 4970 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Nov 28 13:20:26 crc kubenswrapper[4970]: I1128 13:20:26.998981 4970 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Nov 28 13:20:27 crc kubenswrapper[4970]: I1128 13:20:27.018740 4970 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Nov 28 13:20:27 crc kubenswrapper[4970]: I1128 13:20:27.037810 4970 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Nov 28 13:20:27 crc kubenswrapper[4970]: I1128 13:20:27.058281 4970 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Nov 28 13:20:27 crc kubenswrapper[4970]: I1128 13:20:27.079856 4970 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Nov 28 13:20:27 crc kubenswrapper[4970]: I1128 13:20:27.098957 4970 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Nov 28 13:20:27 crc kubenswrapper[4970]: I1128 13:20:27.111795 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/78210a9d-d2ee-4d21-a0e5-956cb8fd85d2-secret-volume\") pod \"collect-profiles-29405595-bgxjt\" (UID: \"78210a9d-d2ee-4d21-a0e5-956cb8fd85d2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29405595-bgxjt" Nov 28 13:20:27 crc kubenswrapper[4970]: I1128 13:20:27.112365 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/5d8594d0-33ac-470e-b2d5-65d3afaa625b-profile-collector-cert\") pod \"olm-operator-6b444d44fb-gdzxc\" (UID: \"5d8594d0-33ac-470e-b2d5-65d3afaa625b\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-gdzxc" Nov 28 13:20:27 crc kubenswrapper[4970]: I1128 13:20:27.116743 4970 request.go:700] Waited for 1.002160112s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-operator-lifecycle-manager/configmaps?fieldSelector=metadata.name%3Dkube-root-ca.crt&limit=500&resourceVersion=0 Nov 28 13:20:27 crc kubenswrapper[4970]: I1128 13:20:27.118735 4970 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Nov 28 13:20:27 crc kubenswrapper[4970]: I1128 13:20:27.138038 4970 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Nov 28 13:20:27 crc kubenswrapper[4970]: I1128 13:20:27.158760 4970 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Nov 28 13:20:27 crc kubenswrapper[4970]: I1128 13:20:27.178893 4970 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Nov 28 13:20:27 crc kubenswrapper[4970]: E1128 13:20:27.179953 4970 configmap.go:193] Couldn't get configMap openshift-controller-manager/client-ca: failed to sync configmap cache: timed out waiting for the condition Nov 28 13:20:27 crc kubenswrapper[4970]: E1128 13:20:27.180082 4970 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/e80ce492-28d4-40cf-8a55-5a4f456e8255-client-ca podName:e80ce492-28d4-40cf-8a55-5a4f456e8255 nodeName:}" failed. No retries permitted until 2025-11-28 13:20:27.680046035 +0000 UTC m=+38.532927875 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "client-ca" (UniqueName: "kubernetes.io/configmap/e80ce492-28d4-40cf-8a55-5a4f456e8255-client-ca") pod "controller-manager-879f6c89f-cbvqk" (UID: "e80ce492-28d4-40cf-8a55-5a4f456e8255") : failed to sync configmap cache: timed out waiting for the condition Nov 28 13:20:27 crc kubenswrapper[4970]: E1128 13:20:27.182569 4970 secret.go:188] Couldn't get secret openshift-controller-manager/serving-cert: failed to sync secret cache: timed out waiting for the condition Nov 28 13:20:27 crc kubenswrapper[4970]: E1128 13:20:27.182657 4970 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e80ce492-28d4-40cf-8a55-5a4f456e8255-serving-cert podName:e80ce492-28d4-40cf-8a55-5a4f456e8255 nodeName:}" failed. No retries permitted until 2025-11-28 13:20:27.682633711 +0000 UTC m=+38.535515541 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/e80ce492-28d4-40cf-8a55-5a4f456e8255-serving-cert") pod "controller-manager-879f6c89f-cbvqk" (UID: "e80ce492-28d4-40cf-8a55-5a4f456e8255") : failed to sync secret cache: timed out waiting for the condition Nov 28 13:20:27 crc kubenswrapper[4970]: I1128 13:20:27.191941 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/e2bc2c5d-22da-4436-b7bc-0924d7f275f2-machine-approver-tls\") pod \"machine-approver-56656f9798-m8rgw\" (UID: \"e2bc2c5d-22da-4436-b7bc-0924d7f275f2\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-m8rgw" Nov 28 13:20:27 crc kubenswrapper[4970]: I1128 13:20:27.198809 4970 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Nov 28 13:20:27 crc kubenswrapper[4970]: I1128 13:20:27.208515 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/e2bc2c5d-22da-4436-b7bc-0924d7f275f2-auth-proxy-config\") pod \"machine-approver-56656f9798-m8rgw\" (UID: \"e2bc2c5d-22da-4436-b7bc-0924d7f275f2\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-m8rgw" Nov 28 13:20:27 crc kubenswrapper[4970]: I1128 13:20:27.218330 4970 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Nov 28 13:20:27 crc kubenswrapper[4970]: I1128 13:20:27.227481 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e2bc2c5d-22da-4436-b7bc-0924d7f275f2-config\") pod \"machine-approver-56656f9798-m8rgw\" (UID: \"e2bc2c5d-22da-4436-b7bc-0924d7f275f2\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-m8rgw" Nov 28 13:20:27 crc kubenswrapper[4970]: I1128 13:20:27.238366 4970 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Nov 28 13:20:27 crc kubenswrapper[4970]: I1128 13:20:27.258778 4970 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Nov 28 13:20:27 crc kubenswrapper[4970]: I1128 13:20:27.279028 4970 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Nov 28 13:20:27 crc kubenswrapper[4970]: I1128 13:20:27.299451 4970 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Nov 28 13:20:27 crc kubenswrapper[4970]: I1128 13:20:27.311815 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/3e2746d8-5b75-43dc-80f4-81f69c38bb35-signing-key\") pod \"service-ca-9c57cc56f-kfmkb\" (UID: \"3e2746d8-5b75-43dc-80f4-81f69c38bb35\") " pod="openshift-service-ca/service-ca-9c57cc56f-kfmkb" Nov 28 13:20:27 crc kubenswrapper[4970]: I1128 13:20:27.318388 4970 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Nov 28 13:20:27 crc kubenswrapper[4970]: I1128 13:20:27.329122 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/3e2746d8-5b75-43dc-80f4-81f69c38bb35-signing-cabundle\") pod \"service-ca-9c57cc56f-kfmkb\" (UID: \"3e2746d8-5b75-43dc-80f4-81f69c38bb35\") " 
pod="openshift-service-ca/service-ca-9c57cc56f-kfmkb" Nov 28 13:20:27 crc kubenswrapper[4970]: I1128 13:20:27.338515 4970 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Nov 28 13:20:27 crc kubenswrapper[4970]: I1128 13:20:27.359805 4970 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Nov 28 13:20:27 crc kubenswrapper[4970]: I1128 13:20:27.374101 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/34a23190-6bba-4504-ada1-0724c8a4f1df-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-t7pk6\" (UID: \"34a23190-6bba-4504-ada1-0724c8a4f1df\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-t7pk6" Nov 28 13:20:27 crc kubenswrapper[4970]: I1128 13:20:27.379988 4970 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4vr87" Nov 28 13:20:27 crc kubenswrapper[4970]: I1128 13:20:27.382417 4970 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Nov 28 13:20:27 crc kubenswrapper[4970]: I1128 13:20:27.393152 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/5d8594d0-33ac-470e-b2d5-65d3afaa625b-srv-cert\") pod \"olm-operator-6b444d44fb-gdzxc\" (UID: \"5d8594d0-33ac-470e-b2d5-65d3afaa625b\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-gdzxc" Nov 28 13:20:27 crc kubenswrapper[4970]: E1128 13:20:27.395894 4970 configmap.go:193] Couldn't get configMap openshift-marketplace/marketplace-trusted-ca: failed to sync configmap cache: timed out waiting for the condition Nov 28 13:20:27 crc kubenswrapper[4970]: E1128 13:20:27.396039 4970 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/514859c2-bd3c-4ccb-90b0-61180a1bc297-marketplace-trusted-ca podName:514859c2-bd3c-4ccb-90b0-61180a1bc297 nodeName:}" failed. No retries permitted until 2025-11-28 13:20:27.896010473 +0000 UTC m=+38.748892303 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "marketplace-trusted-ca" (UniqueName: "kubernetes.io/configmap/514859c2-bd3c-4ccb-90b0-61180a1bc297-marketplace-trusted-ca") pod "marketplace-operator-79b997595-nccsb" (UID: "514859c2-bd3c-4ccb-90b0-61180a1bc297") : failed to sync configmap cache: timed out waiting for the condition Nov 28 13:20:27 crc kubenswrapper[4970]: E1128 13:20:27.396105 4970 secret.go:188] Couldn't get secret openshift-operator-lifecycle-manager/packageserver-service-cert: failed to sync secret cache: timed out waiting for the condition Nov 28 13:20:27 crc kubenswrapper[4970]: E1128 13:20:27.396142 4970 secret.go:188] Couldn't get secret openshift-machine-config-operator/node-bootstrapper-token: failed to sync secret cache: timed out waiting for the condition Nov 28 13:20:27 crc kubenswrapper[4970]: E1128 13:20:27.396204 4970 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/849a6379-8100-4799-aa30-06c1359673b7-apiservice-cert podName:849a6379-8100-4799-aa30-06c1359673b7 nodeName:}" failed. No retries permitted until 2025-11-28 13:20:27.896176838 +0000 UTC m=+38.749058678 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "apiservice-cert" (UniqueName: "kubernetes.io/secret/849a6379-8100-4799-aa30-06c1359673b7-apiservice-cert") pod "packageserver-d55dfcdfc-hv5px" (UID: "849a6379-8100-4799-aa30-06c1359673b7") : failed to sync secret cache: timed out waiting for the condition Nov 28 13:20:27 crc kubenswrapper[4970]: E1128 13:20:27.396260 4970 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/15dfbf5e-77ac-477c-89f7-b1e035a219c0-node-bootstrap-token podName:15dfbf5e-77ac-477c-89f7-b1e035a219c0 nodeName:}" failed. No retries permitted until 2025-11-28 13:20:27.89624663 +0000 UTC m=+38.749128460 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "node-bootstrap-token" (UniqueName: "kubernetes.io/secret/15dfbf5e-77ac-477c-89f7-b1e035a219c0-node-bootstrap-token") pod "machine-config-server-l7c64" (UID: "15dfbf5e-77ac-477c-89f7-b1e035a219c0") : failed to sync secret cache: timed out waiting for the condition Nov 28 13:20:27 crc kubenswrapper[4970]: E1128 13:20:27.396300 4970 secret.go:188] Couldn't get secret openshift-ingress-canary/canary-serving-cert: failed to sync secret cache: timed out waiting for the condition Nov 28 13:20:27 crc kubenswrapper[4970]: E1128 13:20:27.396352 4970 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a4f08974-1f20-4fe4-a63c-3c69b3064e4b-cert podName:a4f08974-1f20-4fe4-a63c-3c69b3064e4b nodeName:}" failed. No retries permitted until 2025-11-28 13:20:27.896334813 +0000 UTC m=+38.749216643 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/a4f08974-1f20-4fe4-a63c-3c69b3064e4b-cert") pod "ingress-canary-kdkxk" (UID: "a4f08974-1f20-4fe4-a63c-3c69b3064e4b") : failed to sync secret cache: timed out waiting for the condition Nov 28 13:20:27 crc kubenswrapper[4970]: E1128 13:20:27.396395 4970 secret.go:188] Couldn't get secret openshift-operator-lifecycle-manager/packageserver-service-cert: failed to sync secret cache: timed out waiting for the condition Nov 28 13:20:27 crc kubenswrapper[4970]: E1128 13:20:27.396449 4970 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/849a6379-8100-4799-aa30-06c1359673b7-webhook-cert podName:849a6379-8100-4799-aa30-06c1359673b7 nodeName:}" failed. No retries permitted until 2025-11-28 13:20:27.896432666 +0000 UTC m=+38.749314656 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "webhook-cert" (UniqueName: "kubernetes.io/secret/849a6379-8100-4799-aa30-06c1359673b7-webhook-cert") pod "packageserver-d55dfcdfc-hv5px" (UID: "849a6379-8100-4799-aa30-06c1359673b7") : failed to sync secret cache: timed out waiting for the condition Nov 28 13:20:27 crc kubenswrapper[4970]: E1128 13:20:27.397005 4970 secret.go:188] Couldn't get secret openshift-machine-config-operator/machine-config-server-tls: failed to sync secret cache: timed out waiting for the condition Nov 28 13:20:27 crc kubenswrapper[4970]: E1128 13:20:27.397100 4970 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/15dfbf5e-77ac-477c-89f7-b1e035a219c0-certs podName:15dfbf5e-77ac-477c-89f7-b1e035a219c0 nodeName:}" failed. No retries permitted until 2025-11-28 13:20:27.897069575 +0000 UTC m=+38.749951425 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "certs" (UniqueName: "kubernetes.io/secret/15dfbf5e-77ac-477c-89f7-b1e035a219c0-certs") pod "machine-config-server-l7c64" (UID: "15dfbf5e-77ac-477c-89f7-b1e035a219c0") : failed to sync secret cache: timed out waiting for the condition Nov 28 13:20:27 crc kubenswrapper[4970]: E1128 13:20:27.397272 4970 configmap.go:193] Couldn't get configMap openshift-multus/cni-sysctl-allowlist: failed to sync configmap cache: timed out waiting for the condition Nov 28 13:20:27 crc kubenswrapper[4970]: E1128 13:20:27.397358 4970 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/a5b9dda0-da70-4e7c-850b-de8b7744a15c-cni-sysctl-allowlist podName:a5b9dda0-da70-4e7c-850b-de8b7744a15c nodeName:}" failed. No retries permitted until 2025-11-28 13:20:27.897336392 +0000 UTC m=+38.750218292 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cni-sysctl-allowlist" (UniqueName: "kubernetes.io/configmap/a5b9dda0-da70-4e7c-850b-de8b7744a15c-cni-sysctl-allowlist") pod "cni-sysctl-allowlist-ds-8btsr" (UID: "a5b9dda0-da70-4e7c-850b-de8b7744a15c") : failed to sync configmap cache: timed out waiting for the condition Nov 28 13:20:27 crc kubenswrapper[4970]: E1128 13:20:27.397491 4970 secret.go:188] Couldn't get secret openshift-multus/multus-admission-controller-secret: failed to sync secret cache: timed out waiting for the condition Nov 28 13:20:27 crc kubenswrapper[4970]: E1128 13:20:27.397577 4970 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a60c525e-c2a9-4977-ae16-e2be015eab30-webhook-certs podName:a60c525e-c2a9-4977-ae16-e2be015eab30 nodeName:}" failed. No retries permitted until 2025-11-28 13:20:27.897550629 +0000 UTC m=+38.750432539 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/a60c525e-c2a9-4977-ae16-e2be015eab30-webhook-certs") pod "multus-admission-controller-857f4d67dd-pldh4" (UID: "a60c525e-c2a9-4977-ae16-e2be015eab30") : failed to sync secret cache: timed out waiting for the condition Nov 28 13:20:27 crc kubenswrapper[4970]: E1128 13:20:27.397618 4970 secret.go:188] Couldn't get secret openshift-dns/dns-default-metrics-tls: failed to sync secret cache: timed out waiting for the condition Nov 28 13:20:27 crc kubenswrapper[4970]: E1128 13:20:27.397675 4970 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dc2cf336-9480-4170-b13a-1f9b0d3cbcba-metrics-tls podName:dc2cf336-9480-4170-b13a-1f9b0d3cbcba nodeName:}" failed. No retries permitted until 2025-11-28 13:20:27.897658312 +0000 UTC m=+38.750540282 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/dc2cf336-9480-4170-b13a-1f9b0d3cbcba-metrics-tls") pod "dns-default-p47s7" (UID: "dc2cf336-9480-4170-b13a-1f9b0d3cbcba") : failed to sync secret cache: timed out waiting for the condition Nov 28 13:20:27 crc kubenswrapper[4970]: E1128 13:20:27.397725 4970 secret.go:188] Couldn't get secret openshift-marketplace/marketplace-operator-metrics: failed to sync secret cache: timed out waiting for the condition Nov 28 13:20:27 crc kubenswrapper[4970]: E1128 13:20:27.397809 4970 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/514859c2-bd3c-4ccb-90b0-61180a1bc297-marketplace-operator-metrics podName:514859c2-bd3c-4ccb-90b0-61180a1bc297 nodeName:}" failed. 
No retries permitted until 2025-11-28 13:20:27.897786226 +0000 UTC m=+38.750668136 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "marketplace-operator-metrics" (UniqueName: "kubernetes.io/secret/514859c2-bd3c-4ccb-90b0-61180a1bc297-marketplace-operator-metrics") pod "marketplace-operator-79b997595-nccsb" (UID: "514859c2-bd3c-4ccb-90b0-61180a1bc297") : failed to sync secret cache: timed out waiting for the condition Nov 28 13:20:27 crc kubenswrapper[4970]: E1128 13:20:27.397880 4970 configmap.go:193] Couldn't get configMap openshift-dns/dns-default: failed to sync configmap cache: timed out waiting for the condition Nov 28 13:20:27 crc kubenswrapper[4970]: E1128 13:20:27.397951 4970 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/dc2cf336-9480-4170-b13a-1f9b0d3cbcba-config-volume podName:dc2cf336-9480-4170-b13a-1f9b0d3cbcba nodeName:}" failed. No retries permitted until 2025-11-28 13:20:27.89792944 +0000 UTC m=+38.750811390 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "config-volume" (UniqueName: "kubernetes.io/configmap/dc2cf336-9480-4170-b13a-1f9b0d3cbcba-config-volume") pod "dns-default-p47s7" (UID: "dc2cf336-9480-4170-b13a-1f9b0d3cbcba") : failed to sync configmap cache: timed out waiting for the condition Nov 28 13:20:27 crc kubenswrapper[4970]: E1128 13:20:27.398010 4970 configmap.go:193] Couldn't get configMap openshift-operator-lifecycle-manager/collect-profiles-config: failed to sync configmap cache: timed out waiting for the condition Nov 28 13:20:27 crc kubenswrapper[4970]: E1128 13:20:27.398097 4970 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/78210a9d-d2ee-4d21-a0e5-956cb8fd85d2-config-volume podName:78210a9d-d2ee-4d21-a0e5-956cb8fd85d2 nodeName:}" failed. No retries permitted until 2025-11-28 13:20:27.898075334 +0000 UTC m=+38.750957254 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "config-volume" (UniqueName: "kubernetes.io/configmap/78210a9d-d2ee-4d21-a0e5-956cb8fd85d2-config-volume") pod "collect-profiles-29405595-bgxjt" (UID: "78210a9d-d2ee-4d21-a0e5-956cb8fd85d2") : failed to sync configmap cache: timed out waiting for the condition Nov 28 13:20:27 crc kubenswrapper[4970]: I1128 13:20:27.401141 4970 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Nov 28 13:20:27 crc kubenswrapper[4970]: I1128 13:20:27.421604 4970 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Nov 28 13:20:27 crc kubenswrapper[4970]: I1128 13:20:27.445783 4970 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Nov 28 13:20:27 crc kubenswrapper[4970]: I1128 13:20:27.459344 4970 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Nov 28 13:20:27 crc kubenswrapper[4970]: I1128 13:20:27.478937 4970 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Nov 28 13:20:27 crc kubenswrapper[4970]: I1128 13:20:27.498393 4970 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Nov 28 13:20:27 crc kubenswrapper[4970]: I1128 13:20:27.519816 4970 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Nov 28 13:20:27 crc kubenswrapper[4970]: I1128 13:20:27.538293 4970 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Nov 28 13:20:27 crc kubenswrapper[4970]: I1128 13:20:27.558246 4970 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Nov 28 13:20:27 crc kubenswrapper[4970]: I1128 13:20:27.580437 4970 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Nov 28 13:20:27 crc kubenswrapper[4970]: I1128 13:20:27.607514 4970 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Nov 28 13:20:27 crc kubenswrapper[4970]: I1128 13:20:27.617934 4970 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Nov 28 13:20:27 crc kubenswrapper[4970]: I1128 13:20:27.639336 4970 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Nov 28 13:20:27 crc kubenswrapper[4970]: I1128 13:20:27.658235 4970 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Nov 28 13:20:27 crc kubenswrapper[4970]: I1128 13:20:27.678278 4970 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Nov 28 13:20:27 crc kubenswrapper[4970]: I1128 13:20:27.699364 4970 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Nov 28 13:20:27 crc kubenswrapper[4970]: I1128 13:20:27.718746 4970 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Nov 28 13:20:27 crc kubenswrapper[4970]: I1128 13:20:27.723564 4970 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e80ce492-28d4-40cf-8a55-5a4f456e8255-client-ca\") pod \"controller-manager-879f6c89f-cbvqk\" (UID: \"e80ce492-28d4-40cf-8a55-5a4f456e8255\") " pod="openshift-controller-manager/controller-manager-879f6c89f-cbvqk" Nov 28 13:20:27 crc kubenswrapper[4970]: I1128 13:20:27.723636 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e80ce492-28d4-40cf-8a55-5a4f456e8255-serving-cert\") pod \"controller-manager-879f6c89f-cbvqk\" (UID: \"e80ce492-28d4-40cf-8a55-5a4f456e8255\") " pod="openshift-controller-manager/controller-manager-879f6c89f-cbvqk" Nov 28 13:20:27 crc kubenswrapper[4970]: I1128 13:20:27.739402 4970 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Nov 28 13:20:27 crc kubenswrapper[4970]: I1128 13:20:27.759035 4970 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Nov 28 13:20:27 crc kubenswrapper[4970]: I1128 13:20:27.778955 4970 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Nov 28 13:20:27 crc kubenswrapper[4970]: I1128 13:20:27.799039 4970 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Nov 28 13:20:27 crc kubenswrapper[4970]: I1128 13:20:27.818435 4970 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-sysctl-allowlist" Nov 28 13:20:27 crc kubenswrapper[4970]: I1128 13:20:27.839416 4970 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Nov 28 13:20:27 crc kubenswrapper[4970]: I1128 13:20:27.858852 4970 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Nov 28 13:20:27 crc kubenswrapper[4970]: I1128 13:20:27.878740 4970 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Nov 28 13:20:27 crc kubenswrapper[4970]: I1128 13:20:27.943183 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/a60c525e-c2a9-4977-ae16-e2be015eab30-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-pldh4\" (UID: \"a60c525e-c2a9-4977-ae16-e2be015eab30\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-pldh4" Nov 28 13:20:27 crc kubenswrapper[4970]: I1128 13:20:27.943286 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/dc2cf336-9480-4170-b13a-1f9b0d3cbcba-metrics-tls\") pod \"dns-default-p47s7\" (UID: \"dc2cf336-9480-4170-b13a-1f9b0d3cbcba\") " pod="openshift-dns/dns-default-p47s7" Nov 28 13:20:27 crc kubenswrapper[4970]: I1128 13:20:27.943356 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/514859c2-bd3c-4ccb-90b0-61180a1bc297-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-nccsb\" (UID: \"514859c2-bd3c-4ccb-90b0-61180a1bc297\") " pod="openshift-marketplace/marketplace-operator-79b997595-nccsb" Nov 28 13:20:27 crc kubenswrapper[4970]: I1128 13:20:27.943413 4970 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/dc2cf336-9480-4170-b13a-1f9b0d3cbcba-config-volume\") pod \"dns-default-p47s7\" (UID: \"dc2cf336-9480-4170-b13a-1f9b0d3cbcba\") " pod="openshift-dns/dns-default-p47s7" Nov 28 13:20:27 crc kubenswrapper[4970]: I1128 13:20:27.943435 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/78210a9d-d2ee-4d21-a0e5-956cb8fd85d2-config-volume\") pod \"collect-profiles-29405595-bgxjt\" (UID: \"78210a9d-d2ee-4d21-a0e5-956cb8fd85d2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29405595-bgxjt" Nov 28 13:20:27 crc kubenswrapper[4970]: I1128 13:20:27.943523 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/514859c2-bd3c-4ccb-90b0-61180a1bc297-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-nccsb\" (UID: \"514859c2-bd3c-4ccb-90b0-61180a1bc297\") " pod="openshift-marketplace/marketplace-operator-79b997595-nccsb" Nov 28 13:20:27 crc kubenswrapper[4970]: I1128 13:20:27.943554 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/15dfbf5e-77ac-477c-89f7-b1e035a219c0-node-bootstrap-token\") pod \"machine-config-server-l7c64\" (UID: \"15dfbf5e-77ac-477c-89f7-b1e035a219c0\") " pod="openshift-machine-config-operator/machine-config-server-l7c64" Nov 28 13:20:27 crc kubenswrapper[4970]: I1128 13:20:27.943577 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/849a6379-8100-4799-aa30-06c1359673b7-apiservice-cert\") pod \"packageserver-d55dfcdfc-hv5px\" (UID: \"849a6379-8100-4799-aa30-06c1359673b7\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-hv5px" Nov 28 13:20:27 crc kubenswrapper[4970]: I1128 13:20:27.943667 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a4f08974-1f20-4fe4-a63c-3c69b3064e4b-cert\") pod \"ingress-canary-kdkxk\" (UID: \"a4f08974-1f20-4fe4-a63c-3c69b3064e4b\") " pod="openshift-ingress-canary/ingress-canary-kdkxk" Nov 28 13:20:27 crc kubenswrapper[4970]: I1128 13:20:27.943728 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/849a6379-8100-4799-aa30-06c1359673b7-webhook-cert\") pod \"packageserver-d55dfcdfc-hv5px\" (UID: \"849a6379-8100-4799-aa30-06c1359673b7\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-hv5px" Nov 28 13:20:27 crc kubenswrapper[4970]: I1128 13:20:27.943772 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/15dfbf5e-77ac-477c-89f7-b1e035a219c0-certs\") pod \"machine-config-server-l7c64\" (UID: \"15dfbf5e-77ac-477c-89f7-b1e035a219c0\") " pod="openshift-machine-config-operator/machine-config-server-l7c64" Nov 28 13:20:27 crc kubenswrapper[4970]: I1128 13:20:27.943810 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/a5b9dda0-da70-4e7c-850b-de8b7744a15c-cni-sysctl-allowlist\") pod \"cni-sysctl-allowlist-ds-8btsr\" (UID: \"a5b9dda0-da70-4e7c-850b-de8b7744a15c\") " 
pod="openshift-multus/cni-sysctl-allowlist-ds-8btsr" Nov 28 13:20:27 crc kubenswrapper[4970]: I1128 13:20:27.944684 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/a5b9dda0-da70-4e7c-850b-de8b7744a15c-cni-sysctl-allowlist\") pod \"cni-sysctl-allowlist-ds-8btsr\" (UID: \"a5b9dda0-da70-4e7c-850b-de8b7744a15c\") " pod="openshift-multus/cni-sysctl-allowlist-ds-8btsr" Nov 28 13:20:27 crc kubenswrapper[4970]: I1128 13:20:27.945253 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/dc2cf336-9480-4170-b13a-1f9b0d3cbcba-config-volume\") pod \"dns-default-p47s7\" (UID: \"dc2cf336-9480-4170-b13a-1f9b0d3cbcba\") " pod="openshift-dns/dns-default-p47s7" Nov 28 13:20:27 crc kubenswrapper[4970]: I1128 13:20:27.945421 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/78210a9d-d2ee-4d21-a0e5-956cb8fd85d2-config-volume\") pod \"collect-profiles-29405595-bgxjt\" (UID: \"78210a9d-d2ee-4d21-a0e5-956cb8fd85d2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29405595-bgxjt" Nov 28 13:20:27 crc kubenswrapper[4970]: I1128 13:20:27.946115 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/514859c2-bd3c-4ccb-90b0-61180a1bc297-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-nccsb\" (UID: \"514859c2-bd3c-4ccb-90b0-61180a1bc297\") " pod="openshift-marketplace/marketplace-operator-79b997595-nccsb" Nov 28 13:20:27 crc kubenswrapper[4970]: I1128 13:20:27.947775 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/dc2cf336-9480-4170-b13a-1f9b0d3cbcba-metrics-tls\") pod \"dns-default-p47s7\" (UID: \"dc2cf336-9480-4170-b13a-1f9b0d3cbcba\") " pod="openshift-dns/dns-default-p47s7" Nov 28 13:20:27 crc kubenswrapper[4970]: I1128 13:20:27.947797 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/a60c525e-c2a9-4977-ae16-e2be015eab30-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-pldh4\" (UID: \"a60c525e-c2a9-4977-ae16-e2be015eab30\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-pldh4" Nov 28 13:20:27 crc kubenswrapper[4970]: I1128 13:20:27.948637 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-njb4z\" (UniqueName: \"kubernetes.io/projected/3c65c694-b05d-40db-a754-9b530aadc7a7-kube-api-access-njb4z\") pod \"apiserver-7bbb656c7d-v5kwk\" (UID: \"3c65c694-b05d-40db-a754-9b530aadc7a7\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-v5kwk" Nov 28 13:20:27 crc kubenswrapper[4970]: I1128 13:20:27.950571 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/849a6379-8100-4799-aa30-06c1359673b7-webhook-cert\") pod \"packageserver-d55dfcdfc-hv5px\" (UID: \"849a6379-8100-4799-aa30-06c1359673b7\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-hv5px" Nov 28 13:20:27 crc kubenswrapper[4970]: I1128 13:20:27.950631 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/849a6379-8100-4799-aa30-06c1359673b7-apiservice-cert\") pod \"packageserver-d55dfcdfc-hv5px\" (UID: 
\"849a6379-8100-4799-aa30-06c1359673b7\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-hv5px" Nov 28 13:20:27 crc kubenswrapper[4970]: I1128 13:20:27.951013 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/514859c2-bd3c-4ccb-90b0-61180a1bc297-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-nccsb\" (UID: \"514859c2-bd3c-4ccb-90b0-61180a1bc297\") " pod="openshift-marketplace/marketplace-operator-79b997595-nccsb" Nov 28 13:20:27 crc kubenswrapper[4970]: I1128 13:20:27.951366 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/secret/15dfbf5e-77ac-477c-89f7-b1e035a219c0-certs\") pod \"machine-config-server-l7c64\" (UID: \"15dfbf5e-77ac-477c-89f7-b1e035a219c0\") " pod="openshift-machine-config-operator/machine-config-server-l7c64" Nov 28 13:20:27 crc kubenswrapper[4970]: I1128 13:20:27.951896 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/15dfbf5e-77ac-477c-89f7-b1e035a219c0-node-bootstrap-token\") pod \"machine-config-server-l7c64\" (UID: \"15dfbf5e-77ac-477c-89f7-b1e035a219c0\") " pod="openshift-machine-config-operator/machine-config-server-l7c64" Nov 28 13:20:27 crc kubenswrapper[4970]: I1128 13:20:27.961520 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-79qhd\" (UniqueName: \"kubernetes.io/projected/e1b7817f-3229-472a-b433-c2173e7abf6c-kube-api-access-79qhd\") pod \"openshift-apiserver-operator-796bbdcf4f-6rjdm\" (UID: \"e1b7817f-3229-472a-b433-c2173e7abf6c\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-6rjdm" Nov 28 13:20:27 crc kubenswrapper[4970]: I1128 13:20:27.963118 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7tmnn\" (UniqueName: \"kubernetes.io/projected/65101460-48b8-4bd6-82b0-4f5bd4254ec5-kube-api-access-7tmnn\") pod \"oauth-openshift-558db77b4-cdzrp\" (UID: \"65101460-48b8-4bd6-82b0-4f5bd4254ec5\") " pod="openshift-authentication/oauth-openshift-558db77b4-cdzrp" Nov 28 13:20:27 crc kubenswrapper[4970]: I1128 13:20:27.974825 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vglkf\" (UniqueName: \"kubernetes.io/projected/5998853c-3fbb-403e-b222-5a5c939dbb58-kube-api-access-vglkf\") pod \"machine-api-operator-5694c8668f-wms6k\" (UID: \"5998853c-3fbb-403e-b222-5a5c939dbb58\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-wms6k" Nov 28 13:20:28 crc kubenswrapper[4970]: I1128 13:20:28.004976 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-45pqp\" (UniqueName: \"kubernetes.io/projected/e80ce492-28d4-40cf-8a55-5a4f456e8255-kube-api-access-45pqp\") pod \"controller-manager-879f6c89f-cbvqk\" (UID: \"e80ce492-28d4-40cf-8a55-5a4f456e8255\") " pod="openshift-controller-manager/controller-manager-879f6c89f-cbvqk" Nov 28 13:20:28 crc kubenswrapper[4970]: I1128 13:20:28.017286 4970 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Nov 28 13:20:28 crc kubenswrapper[4970]: I1128 13:20:28.018274 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jwdc8\" (UniqueName: \"kubernetes.io/projected/336437f6-aba2-46ae-bf5f-2555d2db13fb-kube-api-access-jwdc8\") pod \"apiserver-76f77b778f-78pq5\" (UID: 
\"336437f6-aba2-46ae-bf5f-2555d2db13fb\") " pod="openshift-apiserver/apiserver-76f77b778f-78pq5" Nov 28 13:20:28 crc kubenswrapper[4970]: I1128 13:20:28.037983 4970 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Nov 28 13:20:28 crc kubenswrapper[4970]: I1128 13:20:28.048252 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a4f08974-1f20-4fe4-a63c-3c69b3064e4b-cert\") pod \"ingress-canary-kdkxk\" (UID: \"a4f08974-1f20-4fe4-a63c-3c69b3064e4b\") " pod="openshift-ingress-canary/ingress-canary-kdkxk" Nov 28 13:20:28 crc kubenswrapper[4970]: I1128 13:20:28.048784 4970 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-78pq5" Nov 28 13:20:28 crc kubenswrapper[4970]: I1128 13:20:28.058166 4970 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Nov 28 13:20:28 crc kubenswrapper[4970]: I1128 13:20:28.066867 4970 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-wms6k" Nov 28 13:20:28 crc kubenswrapper[4970]: I1128 13:20:28.077875 4970 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Nov 28 13:20:28 crc kubenswrapper[4970]: I1128 13:20:28.117288 4970 request.go:700] Waited for 1.827061567s due to client-side throttling, not priority and fairness, request: POST:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-image-registry/serviceaccounts/cluster-image-registry-operator/token Nov 28 13:20:28 crc kubenswrapper[4970]: I1128 13:20:28.138577 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/44b352f9-ee16-45a4-9674-e954eeed9c6c-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-4s4w9\" (UID: \"44b352f9-ee16-45a4-9674-e954eeed9c6c\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-4s4w9" Nov 28 13:20:28 crc kubenswrapper[4970]: I1128 13:20:28.155890 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9hnf9\" (UniqueName: \"kubernetes.io/projected/8d51dda0-3be6-42ef-a2ff-76d5c2da9b9b-kube-api-access-9hnf9\") pod \"etcd-operator-b45778765-xqmqh\" (UID: \"8d51dda0-3be6-42ef-a2ff-76d5c2da9b9b\") " pod="openshift-etcd-operator/etcd-operator-b45778765-xqmqh" Nov 28 13:20:28 crc kubenswrapper[4970]: I1128 13:20:28.174275 4970 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-6rjdm" Nov 28 13:20:28 crc kubenswrapper[4970]: I1128 13:20:28.175960 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5qmv6\" (UniqueName: \"kubernetes.io/projected/f85acbe7-b253-4d4a-847f-05845804f712-kube-api-access-5qmv6\") pod \"ingress-operator-5b745b69d9-rzrx9\" (UID: \"f85acbe7-b253-4d4a-847f-05845804f712\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-rzrx9" Nov 28 13:20:28 crc kubenswrapper[4970]: I1128 13:20:28.187567 4970 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-v5kwk" Nov 28 13:20:28 crc kubenswrapper[4970]: I1128 13:20:28.195922 4970 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-cdzrp" Nov 28 13:20:28 crc kubenswrapper[4970]: I1128 13:20:28.197674 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rtzjd\" (UniqueName: \"kubernetes.io/projected/2347a213-dd3e-4f1c-b36b-a8345bedb927-kube-api-access-rtzjd\") pod \"kube-storage-version-migrator-operator-b67b599dd-86qzl\" (UID: \"2347a213-dd3e-4f1c-b36b-a8345bedb927\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-86qzl" Nov 28 13:20:28 crc kubenswrapper[4970]: I1128 13:20:28.220418 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hsklx\" (UniqueName: \"kubernetes.io/projected/b9e4bbc0-c71d-4cb0-82ab-a3c67a9a4894-kube-api-access-hsklx\") pod \"control-plane-machine-set-operator-78cbb6b69f-hvcsr\" (UID: \"b9e4bbc0-c71d-4cb0-82ab-a3c67a9a4894\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-hvcsr" Nov 28 13:20:28 crc kubenswrapper[4970]: I1128 13:20:28.236911 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qnh6w\" (UniqueName: \"kubernetes.io/projected/79187155-9c7e-48a9-a3f8-3bcf8d921be6-kube-api-access-qnh6w\") pod \"downloads-7954f5f757-9chbz\" (UID: \"79187155-9c7e-48a9-a3f8-3bcf8d921be6\") " pod="openshift-console/downloads-7954f5f757-9chbz" Nov 28 13:20:28 crc kubenswrapper[4970]: I1128 13:20:28.264434 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/9cd2da9c-112f-4043-af6f-a661a475cc2d-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-cq28p\" (UID: \"9cd2da9c-112f-4043-af6f-a661a475cc2d\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-cq28p" Nov 28 13:20:28 crc kubenswrapper[4970]: I1128 13:20:28.285480 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rchp7\" (UniqueName: \"kubernetes.io/projected/973ba3b3-d07b-40ef-8419-40e19838e816-kube-api-access-rchp7\") pod \"router-default-5444994796-6tn57\" (UID: \"973ba3b3-d07b-40ef-8419-40e19838e816\") " pod="openshift-ingress/router-default-5444994796-6tn57" Nov 28 13:20:28 crc kubenswrapper[4970]: I1128 13:20:28.304573 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-94hsc\" (UniqueName: \"kubernetes.io/projected/3bf3cce7-2248-44a8-a39d-54860c49fb5f-kube-api-access-94hsc\") pod \"authentication-operator-69f744f599-xzv2b\" (UID: \"3bf3cce7-2248-44a8-a39d-54860c49fb5f\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-xzv2b" Nov 28 13:20:28 crc kubenswrapper[4970]: I1128 13:20:28.312116 4970 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-cq28p" Nov 28 13:20:28 crc kubenswrapper[4970]: I1128 13:20:28.315177 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zqkxm\" (UniqueName: \"kubernetes.io/projected/44b352f9-ee16-45a4-9674-e954eeed9c6c-kube-api-access-zqkxm\") pod \"cluster-image-registry-operator-dc59b4c8b-4s4w9\" (UID: \"44b352f9-ee16-45a4-9674-e954eeed9c6c\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-4s4w9" Nov 28 13:20:28 crc kubenswrapper[4970]: I1128 13:20:28.317977 4970 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-86qzl" Nov 28 13:20:28 crc kubenswrapper[4970]: I1128 13:20:28.342440 4970 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-xqmqh" Nov 28 13:20:28 crc kubenswrapper[4970]: I1128 13:20:28.345297 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dc558\" (UniqueName: \"kubernetes.io/projected/1c43e47b-7ccb-41ec-8f8f-08b159bb15f3-kube-api-access-dc558\") pod \"console-operator-58897d9998-pzs29\" (UID: \"1c43e47b-7ccb-41ec-8f8f-08b159bb15f3\") " pod="openshift-console-operator/console-operator-58897d9998-pzs29" Nov 28 13:20:28 crc kubenswrapper[4970]: I1128 13:20:28.363175 4970 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-hvcsr" Nov 28 13:20:28 crc kubenswrapper[4970]: I1128 13:20:28.363829 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wg6tp\" (UniqueName: \"kubernetes.io/projected/b0e1ef8e-6fc5-491d-b658-b812cc556f67-kube-api-access-wg6tp\") pod \"openshift-controller-manager-operator-756b6f6bc6-dhdzs\" (UID: \"b0e1ef8e-6fc5-491d-b658-b812cc556f67\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-dhdzs" Nov 28 13:20:28 crc kubenswrapper[4970]: I1128 13:20:28.380917 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e77b7513-27ee-47a7-b39a-a11dd78a0500-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-nwp64\" (UID: \"e77b7513-27ee-47a7-b39a-a11dd78a0500\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-nwp64" Nov 28 13:20:28 crc kubenswrapper[4970]: I1128 13:20:28.403938 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fz5k5\" (UniqueName: \"kubernetes.io/projected/0c874537-15c1-4f94-b6cd-086c9c1762f0-kube-api-access-fz5k5\") pod \"migrator-59844c95c7-xbnhw\" (UID: \"0c874537-15c1-4f94-b6cd-086c9c1762f0\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-xbnhw" Nov 28 13:20:28 crc kubenswrapper[4970]: I1128 13:20:28.429444 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/6a678474-b488-4785-a87c-70df116b33c9-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-ts64w\" (UID: \"6a678474-b488-4785-a87c-70df116b33c9\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-ts64w" Nov 28 13:20:28 crc kubenswrapper[4970]: I1128 13:20:28.440335 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wpv9z\" (UniqueName: \"kubernetes.io/projected/900e9596-8294-4c4d-857a-1b2bf9adaca7-kube-api-access-wpv9z\") pod \"route-controller-manager-6576b87f9c-znlgk\" (UID: \"900e9596-8294-4c4d-857a-1b2bf9adaca7\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-znlgk" Nov 28 13:20:28 crc kubenswrapper[4970]: I1128 13:20:28.453682 4970 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-4s4w9" Nov 28 13:20:28 crc kubenswrapper[4970]: I1128 13:20:28.464235 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/f85acbe7-b253-4d4a-847f-05845804f712-bound-sa-token\") pod \"ingress-operator-5b745b69d9-rzrx9\" (UID: \"f85acbe7-b253-4d4a-847f-05845804f712\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-rzrx9" Nov 28 13:20:28 crc kubenswrapper[4970]: I1128 13:20:28.484824 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rr8gn\" (UniqueName: \"kubernetes.io/projected/19ff5c79-1e07-4c43-8d35-bdf19869c72b-kube-api-access-rr8gn\") pod \"cluster-samples-operator-665b6dd947-mdr5z\" (UID: \"19ff5c79-1e07-4c43-8d35-bdf19869c72b\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-mdr5z" Nov 28 13:20:28 crc kubenswrapper[4970]: I1128 13:20:28.504823 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qpp5l\" (UniqueName: \"kubernetes.io/projected/5a56c71c-0c49-4a0a-aee0-2a1ef5936574-kube-api-access-qpp5l\") pod \"openshift-config-operator-7777fb866f-7nxhr\" (UID: \"5a56c71c-0c49-4a0a-aee0-2a1ef5936574\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-7nxhr" Nov 28 13:20:28 crc kubenswrapper[4970]: I1128 13:20:28.512760 4970 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-9chbz" Nov 28 13:20:28 crc kubenswrapper[4970]: I1128 13:20:28.521682 4970 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-7nxhr" Nov 28 13:20:28 crc kubenswrapper[4970]: I1128 13:20:28.534469 4970 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-xzv2b" Nov 28 13:20:28 crc kubenswrapper[4970]: I1128 13:20:28.538678 4970 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Nov 28 13:20:28 crc kubenswrapper[4970]: I1128 13:20:28.544898 4970 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-pzs29" Nov 28 13:20:28 crc kubenswrapper[4970]: I1128 13:20:28.546025 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pqv54\" (UniqueName: \"kubernetes.io/projected/aa0691f5-c326-4370-a718-bc82bcfbdd78-kube-api-access-pqv54\") pod \"dns-operator-744455d44c-q9h56\" (UID: \"aa0691f5-c326-4370-a718-bc82bcfbdd78\") " pod="openshift-dns-operator/dns-operator-744455d44c-q9h56" Nov 28 13:20:28 crc kubenswrapper[4970]: I1128 13:20:28.552324 4970 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-mdr5z" Nov 28 13:20:28 crc kubenswrapper[4970]: I1128 13:20:28.559333 4970 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Nov 28 13:20:28 crc kubenswrapper[4970]: I1128 13:20:28.564788 4970 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-dhdzs" Nov 28 13:20:28 crc kubenswrapper[4970]: I1128 13:20:28.567766 4970 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-6tn57" Nov 28 13:20:28 crc kubenswrapper[4970]: I1128 13:20:28.579409 4970 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Nov 28 13:20:28 crc kubenswrapper[4970]: I1128 13:20:28.600058 4970 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Nov 28 13:20:28 crc kubenswrapper[4970]: I1128 13:20:28.624541 4970 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-q9h56" Nov 28 13:20:28 crc kubenswrapper[4970]: I1128 13:20:28.625562 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5v7cs\" (UniqueName: \"kubernetes.io/projected/cbdc9822-68a6-4bff-b373-cac82f25f4d3-kube-api-access-5v7cs\") pod \"console-f9d7485db-kwnx5\" (UID: \"cbdc9822-68a6-4bff-b373-cac82f25f4d3\") " pod="openshift-console/console-f9d7485db-kwnx5" Nov 28 13:20:28 crc kubenswrapper[4970]: I1128 13:20:28.631719 4970 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-rzrx9" Nov 28 13:20:28 crc kubenswrapper[4970]: I1128 13:20:28.647977 4970 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-ts64w" Nov 28 13:20:28 crc kubenswrapper[4970]: I1128 13:20:28.654700 4970 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-znlgk" Nov 28 13:20:28 crc kubenswrapper[4970]: I1128 13:20:28.655106 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jt7m4\" (UniqueName: \"kubernetes.io/projected/514859c2-bd3c-4ccb-90b0-61180a1bc297-kube-api-access-jt7m4\") pod \"marketplace-operator-79b997595-nccsb\" (UID: \"514859c2-bd3c-4ccb-90b0-61180a1bc297\") " pod="openshift-marketplace/marketplace-operator-79b997595-nccsb" Nov 28 13:20:28 crc kubenswrapper[4970]: I1128 13:20:28.673338 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rvgfk\" (UniqueName: \"kubernetes.io/projected/6ff1b626-d50c-4608-be78-c27b787cc369-kube-api-access-rvgfk\") pod \"machine-config-operator-74547568cd-4rqzq\" (UID: \"6ff1b626-d50c-4608-be78-c27b787cc369\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-4rqzq" Nov 28 13:20:28 crc kubenswrapper[4970]: I1128 13:20:28.679170 4970 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-nwp64" Nov 28 13:20:28 crc kubenswrapper[4970]: I1128 13:20:28.679767 4970 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-xbnhw" Nov 28 13:20:28 crc kubenswrapper[4970]: I1128 13:20:28.689953 4970 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-4rqzq" Nov 28 13:20:28 crc kubenswrapper[4970]: I1128 13:20:28.694125 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-65zmp\" (UniqueName: \"kubernetes.io/projected/3e2746d8-5b75-43dc-80f4-81f69c38bb35-kube-api-access-65zmp\") pod \"service-ca-9c57cc56f-kfmkb\" (UID: \"3e2746d8-5b75-43dc-80f4-81f69c38bb35\") " pod="openshift-service-ca/service-ca-9c57cc56f-kfmkb" Nov 28 13:20:28 crc kubenswrapper[4970]: I1128 13:20:28.704145 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q6s4p\" (UniqueName: \"kubernetes.io/projected/e2bc2c5d-22da-4436-b7bc-0924d7f275f2-kube-api-access-q6s4p\") pod \"machine-approver-56656f9798-m8rgw\" (UID: \"e2bc2c5d-22da-4436-b7bc-0924d7f275f2\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-m8rgw" Nov 28 13:20:28 crc kubenswrapper[4970]: E1128 13:20:28.724548 4970 secret.go:188] Couldn't get secret openshift-controller-manager/serving-cert: failed to sync secret cache: timed out waiting for the condition Nov 28 13:20:28 crc kubenswrapper[4970]: E1128 13:20:28.725419 4970 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e80ce492-28d4-40cf-8a55-5a4f456e8255-serving-cert podName:e80ce492-28d4-40cf-8a55-5a4f456e8255 nodeName:}" failed. No retries permitted until 2025-11-28 13:20:29.725391104 +0000 UTC m=+40.578272904 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/e80ce492-28d4-40cf-8a55-5a4f456e8255-serving-cert") pod "controller-manager-879f6c89f-cbvqk" (UID: "e80ce492-28d4-40cf-8a55-5a4f456e8255") : failed to sync secret cache: timed out waiting for the condition Nov 28 13:20:28 crc kubenswrapper[4970]: E1128 13:20:28.724708 4970 configmap.go:193] Couldn't get configMap openshift-controller-manager/client-ca: failed to sync configmap cache: timed out waiting for the condition Nov 28 13:20:28 crc kubenswrapper[4970]: E1128 13:20:28.725523 4970 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/e80ce492-28d4-40cf-8a55-5a4f456e8255-client-ca podName:e80ce492-28d4-40cf-8a55-5a4f456e8255 nodeName:}" failed. No retries permitted until 2025-11-28 13:20:29.725498617 +0000 UTC m=+40.578380417 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "client-ca" (UniqueName: "kubernetes.io/configmap/e80ce492-28d4-40cf-8a55-5a4f456e8255-client-ca") pod "controller-manager-879f6c89f-cbvqk" (UID: "e80ce492-28d4-40cf-8a55-5a4f456e8255") : failed to sync configmap cache: timed out waiting for the condition Nov 28 13:20:28 crc kubenswrapper[4970]: I1128 13:20:28.741531 4970 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-m8rgw" Nov 28 13:20:28 crc kubenswrapper[4970]: I1128 13:20:28.747679 4970 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-kfmkb" Nov 28 13:20:28 crc kubenswrapper[4970]: I1128 13:20:28.759751 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qsz8x\" (UniqueName: \"kubernetes.io/projected/dc2cf336-9480-4170-b13a-1f9b0d3cbcba-kube-api-access-qsz8x\") pod \"dns-default-p47s7\" (UID: \"dc2cf336-9480-4170-b13a-1f9b0d3cbcba\") " pod="openshift-dns/dns-default-p47s7" Nov 28 13:20:28 crc kubenswrapper[4970]: I1128 13:20:28.767171 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gx24w\" (UniqueName: \"kubernetes.io/projected/a5b9dda0-da70-4e7c-850b-de8b7744a15c-kube-api-access-gx24w\") pod \"cni-sysctl-allowlist-ds-8btsr\" (UID: \"a5b9dda0-da70-4e7c-850b-de8b7744a15c\") " pod="openshift-multus/cni-sysctl-allowlist-ds-8btsr" Nov 28 13:20:28 crc kubenswrapper[4970]: I1128 13:20:28.767466 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lskfl\" (UniqueName: \"kubernetes.io/projected/34a23190-6bba-4504-ada1-0724c8a4f1df-kube-api-access-lskfl\") pod \"package-server-manager-789f6589d5-t7pk6\" (UID: \"34a23190-6bba-4504-ada1-0724c8a4f1df\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-t7pk6" Nov 28 13:20:28 crc kubenswrapper[4970]: I1128 13:20:28.774131 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rqprh\" (UniqueName: \"kubernetes.io/projected/15dfbf5e-77ac-477c-89f7-b1e035a219c0-kube-api-access-rqprh\") pod \"machine-config-server-l7c64\" (UID: \"15dfbf5e-77ac-477c-89f7-b1e035a219c0\") " pod="openshift-machine-config-operator/machine-config-server-l7c64" Nov 28 13:20:28 crc kubenswrapper[4970]: I1128 13:20:28.778240 4970 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-p47s7" Nov 28 13:20:28 crc kubenswrapper[4970]: I1128 13:20:28.793594 4970 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-nccsb" Nov 28 13:20:28 crc kubenswrapper[4970]: I1128 13:20:28.797388 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zvt9l\" (UniqueName: \"kubernetes.io/projected/849a6379-8100-4799-aa30-06c1359673b7-kube-api-access-zvt9l\") pod \"packageserver-d55dfcdfc-hv5px\" (UID: \"849a6379-8100-4799-aa30-06c1359673b7\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-hv5px" Nov 28 13:20:28 crc kubenswrapper[4970]: I1128 13:20:28.807125 4970 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-l7c64" Nov 28 13:20:28 crc kubenswrapper[4970]: I1128 13:20:28.811445 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b47zt\" (UniqueName: \"kubernetes.io/projected/a4f08974-1f20-4fe4-a63c-3c69b3064e4b-kube-api-access-b47zt\") pod \"ingress-canary-kdkxk\" (UID: \"a4f08974-1f20-4fe4-a63c-3c69b3064e4b\") " pod="openshift-ingress-canary/ingress-canary-kdkxk" Nov 28 13:20:28 crc kubenswrapper[4970]: I1128 13:20:28.825013 4970 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/cni-sysctl-allowlist-ds-8btsr" Nov 28 13:20:28 crc kubenswrapper[4970]: I1128 13:20:28.827187 4970 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-kwnx5" Nov 28 13:20:28 crc kubenswrapper[4970]: I1128 13:20:28.834360 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fhx5j\" (UniqueName: \"kubernetes.io/projected/24797c3c-a0a6-4bab-b61e-dcc1aaedcccb-kube-api-access-fhx5j\") pod \"csi-hostpathplugin-2qght\" (UID: \"24797c3c-a0a6-4bab-b61e-dcc1aaedcccb\") " pod="hostpath-provisioner/csi-hostpathplugin-2qght" Nov 28 13:20:28 crc kubenswrapper[4970]: I1128 13:20:28.858423 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-szm5d\" (UniqueName: \"kubernetes.io/projected/a60c525e-c2a9-4977-ae16-e2be015eab30-kube-api-access-szm5d\") pod \"multus-admission-controller-857f4d67dd-pldh4\" (UID: \"a60c525e-c2a9-4977-ae16-e2be015eab30\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-pldh4" Nov 28 13:20:28 crc kubenswrapper[4970]: I1128 13:20:28.863550 4970 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-2qght" Nov 28 13:20:28 crc kubenswrapper[4970]: I1128 13:20:28.873845 4970 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-kdkxk" Nov 28 13:20:28 crc kubenswrapper[4970]: I1128 13:20:28.885870 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v6nzc\" (UniqueName: \"kubernetes.io/projected/5d8594d0-33ac-470e-b2d5-65d3afaa625b-kube-api-access-v6nzc\") pod \"olm-operator-6b444d44fb-gdzxc\" (UID: \"5d8594d0-33ac-470e-b2d5-65d3afaa625b\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-gdzxc" Nov 28 13:20:28 crc kubenswrapper[4970]: I1128 13:20:28.895597 4970 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-78pq5"] Nov 28 13:20:28 crc kubenswrapper[4970]: I1128 13:20:28.904850 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cp4gq\" (UniqueName: \"kubernetes.io/projected/78210a9d-d2ee-4d21-a0e5-956cb8fd85d2-kube-api-access-cp4gq\") pod \"collect-profiles-29405595-bgxjt\" (UID: \"78210a9d-d2ee-4d21-a0e5-956cb8fd85d2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29405595-bgxjt" Nov 28 13:20:28 crc kubenswrapper[4970]: I1128 13:20:28.918940 4970 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Nov 28 13:20:28 crc kubenswrapper[4970]: I1128 13:20:28.928815 4970 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-wms6k"] Nov 28 13:20:28 crc kubenswrapper[4970]: I1128 13:20:28.939691 4970 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Nov 28 13:20:28 crc kubenswrapper[4970]: I1128 13:20:28.958765 4970 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Nov 28 13:20:28 crc kubenswrapper[4970]: I1128 13:20:28.978112 4970 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Nov 28 13:20:29 crc kubenswrapper[4970]: I1128 13:20:29.001254 4970 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Nov 28 13:20:29 crc kubenswrapper[4970]: I1128 13:20:29.025461 4970 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-authentication/oauth-openshift-558db77b4-cdzrp"] Nov 28 13:20:29 crc kubenswrapper[4970]: I1128 13:20:29.046675 4970 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-dhdzs"] Nov 28 13:20:29 crc kubenswrapper[4970]: I1128 13:20:29.055704 4970 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-t7pk6" Nov 28 13:20:29 crc kubenswrapper[4970]: I1128 13:20:29.060755 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/309db78c-54d0-452b-8b62-979217816260-registry-tls\") pod \"image-registry-697d97f7c8-tjj4p\" (UID: \"309db78c-54d0-452b-8b62-979217816260\") " pod="openshift-image-registry/image-registry-697d97f7c8-tjj4p" Nov 28 13:20:29 crc kubenswrapper[4970]: I1128 13:20:29.060865 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/77028134-907f-445b-8470-d961a46beea4-proxy-tls\") pod \"machine-config-controller-84d6567774-w4m45\" (UID: \"77028134-907f-445b-8470-d961a46beea4\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-w4m45" Nov 28 13:20:29 crc kubenswrapper[4970]: I1128 13:20:29.060914 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/309db78c-54d0-452b-8b62-979217816260-bound-sa-token\") pod \"image-registry-697d97f7c8-tjj4p\" (UID: \"309db78c-54d0-452b-8b62-979217816260\") " pod="openshift-image-registry/image-registry-697d97f7c8-tjj4p" Nov 28 13:20:29 crc kubenswrapper[4970]: I1128 13:20:29.061019 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tjj4p\" (UID: \"309db78c-54d0-452b-8b62-979217816260\") " pod="openshift-image-registry/image-registry-697d97f7c8-tjj4p" Nov 28 13:20:29 crc kubenswrapper[4970]: I1128 13:20:29.061152 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/309db78c-54d0-452b-8b62-979217816260-installation-pull-secrets\") pod \"image-registry-697d97f7c8-tjj4p\" (UID: \"309db78c-54d0-452b-8b62-979217816260\") " pod="openshift-image-registry/image-registry-697d97f7c8-tjj4p" Nov 28 13:20:29 crc kubenswrapper[4970]: I1128 13:20:29.061183 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/05583a8a-e618-4f75-9ce1-b7ada5fc1ae7-profile-collector-cert\") pod \"catalog-operator-68c6474976-rjnn5\" (UID: \"05583a8a-e618-4f75-9ce1-b7ada5fc1ae7\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-rjnn5" Nov 28 13:20:29 crc kubenswrapper[4970]: I1128 13:20:29.061204 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qxrkj\" (UniqueName: \"kubernetes.io/projected/05583a8a-e618-4f75-9ce1-b7ada5fc1ae7-kube-api-access-qxrkj\") pod \"catalog-operator-68c6474976-rjnn5\" (UID: \"05583a8a-e618-4f75-9ce1-b7ada5fc1ae7\") " 
pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-rjnn5" Nov 28 13:20:29 crc kubenswrapper[4970]: I1128 13:20:29.061258 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4wnc7\" (UniqueName: \"kubernetes.io/projected/77028134-907f-445b-8470-d961a46beea4-kube-api-access-4wnc7\") pod \"machine-config-controller-84d6567774-w4m45\" (UID: \"77028134-907f-445b-8470-d961a46beea4\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-w4m45" Nov 28 13:20:29 crc kubenswrapper[4970]: I1128 13:20:29.061282 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7fntj\" (UniqueName: \"kubernetes.io/projected/23eb3763-e0a8-4069-baa1-79134daba66e-kube-api-access-7fntj\") pod \"service-ca-operator-777779d784-7bxck\" (UID: \"23eb3763-e0a8-4069-baa1-79134daba66e\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-7bxck" Nov 28 13:20:29 crc kubenswrapper[4970]: I1128 13:20:29.061303 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/05583a8a-e618-4f75-9ce1-b7ada5fc1ae7-srv-cert\") pod \"catalog-operator-68c6474976-rjnn5\" (UID: \"05583a8a-e618-4f75-9ce1-b7ada5fc1ae7\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-rjnn5" Nov 28 13:20:29 crc kubenswrapper[4970]: I1128 13:20:29.061333 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/23eb3763-e0a8-4069-baa1-79134daba66e-config\") pod \"service-ca-operator-777779d784-7bxck\" (UID: \"23eb3763-e0a8-4069-baa1-79134daba66e\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-7bxck" Nov 28 13:20:29 crc kubenswrapper[4970]: I1128 13:20:29.061358 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k6sv2\" (UniqueName: \"kubernetes.io/projected/309db78c-54d0-452b-8b62-979217816260-kube-api-access-k6sv2\") pod \"image-registry-697d97f7c8-tjj4p\" (UID: \"309db78c-54d0-452b-8b62-979217816260\") " pod="openshift-image-registry/image-registry-697d97f7c8-tjj4p" Nov 28 13:20:29 crc kubenswrapper[4970]: I1128 13:20:29.061527 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/77028134-907f-445b-8470-d961a46beea4-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-w4m45\" (UID: \"77028134-907f-445b-8470-d961a46beea4\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-w4m45" Nov 28 13:20:29 crc kubenswrapper[4970]: I1128 13:20:29.061637 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/309db78c-54d0-452b-8b62-979217816260-trusted-ca\") pod \"image-registry-697d97f7c8-tjj4p\" (UID: \"309db78c-54d0-452b-8b62-979217816260\") " pod="openshift-image-registry/image-registry-697d97f7c8-tjj4p" Nov 28 13:20:29 crc kubenswrapper[4970]: I1128 13:20:29.061655 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/309db78c-54d0-452b-8b62-979217816260-registry-certificates\") pod \"image-registry-697d97f7c8-tjj4p\" (UID: 
\"309db78c-54d0-452b-8b62-979217816260\") " pod="openshift-image-registry/image-registry-697d97f7c8-tjj4p" Nov 28 13:20:29 crc kubenswrapper[4970]: I1128 13:20:29.061677 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/309db78c-54d0-452b-8b62-979217816260-ca-trust-extracted\") pod \"image-registry-697d97f7c8-tjj4p\" (UID: \"309db78c-54d0-452b-8b62-979217816260\") " pod="openshift-image-registry/image-registry-697d97f7c8-tjj4p" Nov 28 13:20:29 crc kubenswrapper[4970]: I1128 13:20:29.061695 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/23eb3763-e0a8-4069-baa1-79134daba66e-serving-cert\") pod \"service-ca-operator-777779d784-7bxck\" (UID: \"23eb3763-e0a8-4069-baa1-79134daba66e\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-7bxck" Nov 28 13:20:29 crc kubenswrapper[4970]: E1128 13:20:29.063090 4970 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-28 13:20:29.563074571 +0000 UTC m=+40.415956451 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tjj4p" (UID: "309db78c-54d0-452b-8b62-979217816260") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 28 13:20:29 crc kubenswrapper[4970]: I1128 13:20:29.065056 4970 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-gdzxc" Nov 28 13:20:29 crc kubenswrapper[4970]: I1128 13:20:29.072032 4970 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-hv5px" Nov 28 13:20:29 crc kubenswrapper[4970]: I1128 13:20:29.076959 4970 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-hvcsr"] Nov 28 13:20:29 crc kubenswrapper[4970]: I1128 13:20:29.086329 4970 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-pldh4" Nov 28 13:20:29 crc kubenswrapper[4970]: I1128 13:20:29.100653 4970 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29405595-bgxjt" Nov 28 13:20:29 crc kubenswrapper[4970]: I1128 13:20:29.146879 4970 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-6rjdm"] Nov 28 13:20:29 crc kubenswrapper[4970]: I1128 13:20:29.162634 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 28 13:20:29 crc kubenswrapper[4970]: I1128 13:20:29.163032 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/309db78c-54d0-452b-8b62-979217816260-bound-sa-token\") pod \"image-registry-697d97f7c8-tjj4p\" (UID: \"309db78c-54d0-452b-8b62-979217816260\") " pod="openshift-image-registry/image-registry-697d97f7c8-tjj4p" Nov 28 13:20:29 crc kubenswrapper[4970]: I1128 13:20:29.163358 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/309db78c-54d0-452b-8b62-979217816260-installation-pull-secrets\") pod \"image-registry-697d97f7c8-tjj4p\" (UID: \"309db78c-54d0-452b-8b62-979217816260\") " pod="openshift-image-registry/image-registry-697d97f7c8-tjj4p" Nov 28 13:20:29 crc kubenswrapper[4970]: I1128 13:20:29.163531 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/05583a8a-e618-4f75-9ce1-b7ada5fc1ae7-profile-collector-cert\") pod \"catalog-operator-68c6474976-rjnn5\" (UID: \"05583a8a-e618-4f75-9ce1-b7ada5fc1ae7\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-rjnn5" Nov 28 13:20:29 crc kubenswrapper[4970]: I1128 13:20:29.163641 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qxrkj\" (UniqueName: \"kubernetes.io/projected/05583a8a-e618-4f75-9ce1-b7ada5fc1ae7-kube-api-access-qxrkj\") pod \"catalog-operator-68c6474976-rjnn5\" (UID: \"05583a8a-e618-4f75-9ce1-b7ada5fc1ae7\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-rjnn5" Nov 28 13:20:29 crc kubenswrapper[4970]: I1128 13:20:29.163762 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4wnc7\" (UniqueName: \"kubernetes.io/projected/77028134-907f-445b-8470-d961a46beea4-kube-api-access-4wnc7\") pod \"machine-config-controller-84d6567774-w4m45\" (UID: \"77028134-907f-445b-8470-d961a46beea4\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-w4m45" Nov 28 13:20:29 crc kubenswrapper[4970]: I1128 13:20:29.165079 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/23eb3763-e0a8-4069-baa1-79134daba66e-config\") pod \"service-ca-operator-777779d784-7bxck\" (UID: \"23eb3763-e0a8-4069-baa1-79134daba66e\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-7bxck" Nov 28 13:20:29 crc kubenswrapper[4970]: I1128 13:20:29.165175 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7fntj\" (UniqueName: \"kubernetes.io/projected/23eb3763-e0a8-4069-baa1-79134daba66e-kube-api-access-7fntj\") pod 
\"service-ca-operator-777779d784-7bxck\" (UID: \"23eb3763-e0a8-4069-baa1-79134daba66e\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-7bxck" Nov 28 13:20:29 crc kubenswrapper[4970]: I1128 13:20:29.165286 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/05583a8a-e618-4f75-9ce1-b7ada5fc1ae7-srv-cert\") pod \"catalog-operator-68c6474976-rjnn5\" (UID: \"05583a8a-e618-4f75-9ce1-b7ada5fc1ae7\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-rjnn5" Nov 28 13:20:29 crc kubenswrapper[4970]: I1128 13:20:29.165387 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k6sv2\" (UniqueName: \"kubernetes.io/projected/309db78c-54d0-452b-8b62-979217816260-kube-api-access-k6sv2\") pod \"image-registry-697d97f7c8-tjj4p\" (UID: \"309db78c-54d0-452b-8b62-979217816260\") " pod="openshift-image-registry/image-registry-697d97f7c8-tjj4p" Nov 28 13:20:29 crc kubenswrapper[4970]: I1128 13:20:29.165641 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/77028134-907f-445b-8470-d961a46beea4-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-w4m45\" (UID: \"77028134-907f-445b-8470-d961a46beea4\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-w4m45" Nov 28 13:20:29 crc kubenswrapper[4970]: I1128 13:20:29.166451 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/77028134-907f-445b-8470-d961a46beea4-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-w4m45\" (UID: \"77028134-907f-445b-8470-d961a46beea4\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-w4m45" Nov 28 13:20:29 crc kubenswrapper[4970]: I1128 13:20:29.166896 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/309db78c-54d0-452b-8b62-979217816260-trusted-ca\") pod \"image-registry-697d97f7c8-tjj4p\" (UID: \"309db78c-54d0-452b-8b62-979217816260\") " pod="openshift-image-registry/image-registry-697d97f7c8-tjj4p" Nov 28 13:20:29 crc kubenswrapper[4970]: I1128 13:20:29.167365 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/309db78c-54d0-452b-8b62-979217816260-ca-trust-extracted\") pod \"image-registry-697d97f7c8-tjj4p\" (UID: \"309db78c-54d0-452b-8b62-979217816260\") " pod="openshift-image-registry/image-registry-697d97f7c8-tjj4p" Nov 28 13:20:29 crc kubenswrapper[4970]: I1128 13:20:29.167469 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/309db78c-54d0-452b-8b62-979217816260-registry-certificates\") pod \"image-registry-697d97f7c8-tjj4p\" (UID: \"309db78c-54d0-452b-8b62-979217816260\") " pod="openshift-image-registry/image-registry-697d97f7c8-tjj4p" Nov 28 13:20:29 crc kubenswrapper[4970]: I1128 13:20:29.167585 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/23eb3763-e0a8-4069-baa1-79134daba66e-serving-cert\") pod \"service-ca-operator-777779d784-7bxck\" (UID: \"23eb3763-e0a8-4069-baa1-79134daba66e\") " 
pod="openshift-service-ca-operator/service-ca-operator-777779d784-7bxck" Nov 28 13:20:29 crc kubenswrapper[4970]: I1128 13:20:29.167780 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/309db78c-54d0-452b-8b62-979217816260-registry-tls\") pod \"image-registry-697d97f7c8-tjj4p\" (UID: \"309db78c-54d0-452b-8b62-979217816260\") " pod="openshift-image-registry/image-registry-697d97f7c8-tjj4p" Nov 28 13:20:29 crc kubenswrapper[4970]: I1128 13:20:29.167959 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/77028134-907f-445b-8470-d961a46beea4-proxy-tls\") pod \"machine-config-controller-84d6567774-w4m45\" (UID: \"77028134-907f-445b-8470-d961a46beea4\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-w4m45" Nov 28 13:20:29 crc kubenswrapper[4970]: E1128 13:20:29.169154 4970 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-28 13:20:29.669124828 +0000 UTC m=+40.522006628 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 28 13:20:29 crc kubenswrapper[4970]: I1128 13:20:29.172811 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/309db78c-54d0-452b-8b62-979217816260-installation-pull-secrets\") pod \"image-registry-697d97f7c8-tjj4p\" (UID: \"309db78c-54d0-452b-8b62-979217816260\") " pod="openshift-image-registry/image-registry-697d97f7c8-tjj4p" Nov 28 13:20:29 crc kubenswrapper[4970]: I1128 13:20:29.172970 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/309db78c-54d0-452b-8b62-979217816260-ca-trust-extracted\") pod \"image-registry-697d97f7c8-tjj4p\" (UID: \"309db78c-54d0-452b-8b62-979217816260\") " pod="openshift-image-registry/image-registry-697d97f7c8-tjj4p" Nov 28 13:20:29 crc kubenswrapper[4970]: I1128 13:20:29.174448 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/05583a8a-e618-4f75-9ce1-b7ada5fc1ae7-profile-collector-cert\") pod \"catalog-operator-68c6474976-rjnn5\" (UID: \"05583a8a-e618-4f75-9ce1-b7ada5fc1ae7\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-rjnn5" Nov 28 13:20:29 crc kubenswrapper[4970]: I1128 13:20:29.168289 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/309db78c-54d0-452b-8b62-979217816260-trusted-ca\") pod \"image-registry-697d97f7c8-tjj4p\" (UID: \"309db78c-54d0-452b-8b62-979217816260\") " pod="openshift-image-registry/image-registry-697d97f7c8-tjj4p" Nov 28 13:20:29 crc kubenswrapper[4970]: I1128 13:20:29.179902 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"registry-tls\" (UniqueName: \"kubernetes.io/projected/309db78c-54d0-452b-8b62-979217816260-registry-tls\") pod \"image-registry-697d97f7c8-tjj4p\" (UID: \"309db78c-54d0-452b-8b62-979217816260\") " pod="openshift-image-registry/image-registry-697d97f7c8-tjj4p" Nov 28 13:20:29 crc kubenswrapper[4970]: I1128 13:20:29.180899 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/23eb3763-e0a8-4069-baa1-79134daba66e-config\") pod \"service-ca-operator-777779d784-7bxck\" (UID: \"23eb3763-e0a8-4069-baa1-79134daba66e\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-7bxck" Nov 28 13:20:29 crc kubenswrapper[4970]: I1128 13:20:29.182860 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/309db78c-54d0-452b-8b62-979217816260-registry-certificates\") pod \"image-registry-697d97f7c8-tjj4p\" (UID: \"309db78c-54d0-452b-8b62-979217816260\") " pod="openshift-image-registry/image-registry-697d97f7c8-tjj4p" Nov 28 13:20:29 crc kubenswrapper[4970]: I1128 13:20:29.186236 4970 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-v5kwk"] Nov 28 13:20:29 crc kubenswrapper[4970]: I1128 13:20:29.191143 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/23eb3763-e0a8-4069-baa1-79134daba66e-serving-cert\") pod \"service-ca-operator-777779d784-7bxck\" (UID: \"23eb3763-e0a8-4069-baa1-79134daba66e\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-7bxck" Nov 28 13:20:29 crc kubenswrapper[4970]: I1128 13:20:29.202972 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/77028134-907f-445b-8470-d961a46beea4-proxy-tls\") pod \"machine-config-controller-84d6567774-w4m45\" (UID: \"77028134-907f-445b-8470-d961a46beea4\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-w4m45" Nov 28 13:20:29 crc kubenswrapper[4970]: I1128 13:20:29.203644 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/05583a8a-e618-4f75-9ce1-b7ada5fc1ae7-srv-cert\") pod \"catalog-operator-68c6474976-rjnn5\" (UID: \"05583a8a-e618-4f75-9ce1-b7ada5fc1ae7\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-rjnn5" Nov 28 13:20:29 crc kubenswrapper[4970]: I1128 13:20:29.222667 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/309db78c-54d0-452b-8b62-979217816260-bound-sa-token\") pod \"image-registry-697d97f7c8-tjj4p\" (UID: \"309db78c-54d0-452b-8b62-979217816260\") " pod="openshift-image-registry/image-registry-697d97f7c8-tjj4p" Nov 28 13:20:29 crc kubenswrapper[4970]: I1128 13:20:29.235910 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qxrkj\" (UniqueName: \"kubernetes.io/projected/05583a8a-e618-4f75-9ce1-b7ada5fc1ae7-kube-api-access-qxrkj\") pod \"catalog-operator-68c6474976-rjnn5\" (UID: \"05583a8a-e618-4f75-9ce1-b7ada5fc1ae7\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-rjnn5" Nov 28 13:20:29 crc kubenswrapper[4970]: I1128 13:20:29.265168 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4wnc7\" (UniqueName: 
\"kubernetes.io/projected/77028134-907f-445b-8470-d961a46beea4-kube-api-access-4wnc7\") pod \"machine-config-controller-84d6567774-w4m45\" (UID: \"77028134-907f-445b-8470-d961a46beea4\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-w4m45" Nov 28 13:20:29 crc kubenswrapper[4970]: I1128 13:20:29.270308 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tjj4p\" (UID: \"309db78c-54d0-452b-8b62-979217816260\") " pod="openshift-image-registry/image-registry-697d97f7c8-tjj4p" Nov 28 13:20:29 crc kubenswrapper[4970]: E1128 13:20:29.270671 4970 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-28 13:20:29.770658032 +0000 UTC m=+40.623539832 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tjj4p" (UID: "309db78c-54d0-452b-8b62-979217816260") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 28 13:20:29 crc kubenswrapper[4970]: I1128 13:20:29.285069 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7fntj\" (UniqueName: \"kubernetes.io/projected/23eb3763-e0a8-4069-baa1-79134daba66e-kube-api-access-7fntj\") pod \"service-ca-operator-777779d784-7bxck\" (UID: \"23eb3763-e0a8-4069-baa1-79134daba66e\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-7bxck" Nov 28 13:20:29 crc kubenswrapper[4970]: I1128 13:20:29.292496 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k6sv2\" (UniqueName: \"kubernetes.io/projected/309db78c-54d0-452b-8b62-979217816260-kube-api-access-k6sv2\") pod \"image-registry-697d97f7c8-tjj4p\" (UID: \"309db78c-54d0-452b-8b62-979217816260\") " pod="openshift-image-registry/image-registry-697d97f7c8-tjj4p" Nov 28 13:20:29 crc kubenswrapper[4970]: I1128 13:20:29.326555 4970 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-w4m45" Nov 28 13:20:29 crc kubenswrapper[4970]: I1128 13:20:29.331976 4970 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-rjnn5" Nov 28 13:20:29 crc kubenswrapper[4970]: I1128 13:20:29.371374 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 28 13:20:29 crc kubenswrapper[4970]: E1128 13:20:29.371731 4970 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-11-28 13:20:29.871700182 +0000 UTC m=+40.724581982 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 28 13:20:29 crc kubenswrapper[4970]: I1128 13:20:29.416293 4970 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-9chbz"] Nov 28 13:20:29 crc kubenswrapper[4970]: I1128 13:20:29.417140 4970 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-7bxck" Nov 28 13:20:29 crc kubenswrapper[4970]: I1128 13:20:29.476707 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tjj4p\" (UID: \"309db78c-54d0-452b-8b62-979217816260\") " pod="openshift-image-registry/image-registry-697d97f7c8-tjj4p" Nov 28 13:20:29 crc kubenswrapper[4970]: E1128 13:20:29.477046 4970 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-28 13:20:29.977030158 +0000 UTC m=+40.829911958 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tjj4p" (UID: "309db78c-54d0-452b-8b62-979217816260") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 28 13:20:29 crc kubenswrapper[4970]: I1128 13:20:29.578599 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 28 13:20:29 crc kubenswrapper[4970]: E1128 13:20:29.578714 4970 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-28 13:20:30.078696466 +0000 UTC m=+40.931578266 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 28 13:20:29 crc kubenswrapper[4970]: I1128 13:20:29.578889 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tjj4p\" (UID: \"309db78c-54d0-452b-8b62-979217816260\") " pod="openshift-image-registry/image-registry-697d97f7c8-tjj4p" Nov 28 13:20:29 crc kubenswrapper[4970]: E1128 13:20:29.579205 4970 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-28 13:20:30.07919576 +0000 UTC m=+40.932077570 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tjj4p" (UID: "309db78c-54d0-452b-8b62-979217816260") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 28 13:20:29 crc kubenswrapper[4970]: I1128 13:20:29.679964 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 28 13:20:29 crc kubenswrapper[4970]: E1128 13:20:29.680145 4970 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-28 13:20:30.180117256 +0000 UTC m=+41.032999056 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 28 13:20:29 crc kubenswrapper[4970]: I1128 13:20:29.680472 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tjj4p\" (UID: \"309db78c-54d0-452b-8b62-979217816260\") " pod="openshift-image-registry/image-registry-697d97f7c8-tjj4p" Nov 28 13:20:29 crc kubenswrapper[4970]: E1128 13:20:29.680839 4970 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-28 13:20:30.180825357 +0000 UTC m=+41.033707157 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tjj4p" (UID: "309db78c-54d0-452b-8b62-979217816260") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 28 13:20:29 crc kubenswrapper[4970]: I1128 13:20:29.706225 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/cni-sysctl-allowlist-ds-8btsr" event={"ID":"a5b9dda0-da70-4e7c-850b-de8b7744a15c","Type":"ContainerStarted","Data":"b44cdf5cbf30e32199221f203c56e2b950666cc11a4487a6c1008683a372cb2a"} Nov 28 13:20:29 crc kubenswrapper[4970]: I1128 13:20:29.707180 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-hvcsr" event={"ID":"b9e4bbc0-c71d-4cb0-82ab-a3c67a9a4894","Type":"ContainerStarted","Data":"b220cb1f074eb3880bf7e3a9413f5bc6c9524ef7c301eb6e5afc53385e76a886"} Nov 28 13:20:29 crc kubenswrapper[4970]: I1128 13:20:29.708525 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-wms6k" event={"ID":"5998853c-3fbb-403e-b222-5a5c939dbb58","Type":"ContainerStarted","Data":"cdcff78397c9bcbe9d3c311297da18c63fda0ce6684b0cc44b158c8c021bc97f"} Nov 28 13:20:29 crc kubenswrapper[4970]: I1128 13:20:29.709447 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-m8rgw" event={"ID":"e2bc2c5d-22da-4436-b7bc-0924d7f275f2","Type":"ContainerStarted","Data":"bf16edc6d27fc3424d3b7305e4d96f04a1a0db2ecebf7665685f4abef7a4e66c"} Nov 28 13:20:29 crc kubenswrapper[4970]: I1128 13:20:29.710242 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-9chbz" event={"ID":"79187155-9c7e-48a9-a3f8-3bcf8d921be6","Type":"ContainerStarted","Data":"7796b43710733240a6c951698b88fc4071e2a82065240ac617940512cb342887"} Nov 28 13:20:29 crc kubenswrapper[4970]: I1128 13:20:29.710912 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-apiserver/apiserver-76f77b778f-78pq5" event={"ID":"336437f6-aba2-46ae-bf5f-2555d2db13fb","Type":"ContainerStarted","Data":"71a678227fa64374738ce7220e73e097414791a3c4fa0f5e7a6ff1b344b1200e"} Nov 28 13:20:29 crc kubenswrapper[4970]: I1128 13:20:29.711800 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-cdzrp" event={"ID":"65101460-48b8-4bd6-82b0-4f5bd4254ec5","Type":"ContainerStarted","Data":"f71cce2f30392cf6a3cb448ff268b7246b2baea435976519788ed6b8aac29e78"} Nov 28 13:20:29 crc kubenswrapper[4970]: I1128 13:20:29.712802 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-6rjdm" event={"ID":"e1b7817f-3229-472a-b433-c2173e7abf6c","Type":"ContainerStarted","Data":"1c457fde676e97ce2578cfb06e618ee3131a9c18eedfd74bd95786bd27978717"} Nov 28 13:20:29 crc kubenswrapper[4970]: I1128 13:20:29.713373 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-l7c64" event={"ID":"15dfbf5e-77ac-477c-89f7-b1e035a219c0","Type":"ContainerStarted","Data":"3171028e03ffbca590278769729a04b0066b508debd5b16521e365c40bb2addd"} Nov 28 13:20:29 crc kubenswrapper[4970]: I1128 13:20:29.713883 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-v5kwk" event={"ID":"3c65c694-b05d-40db-a754-9b530aadc7a7","Type":"ContainerStarted","Data":"8aba39a92201114606b717728586dd54829abc21c458b720caaf11985172ec93"} Nov 28 13:20:29 crc kubenswrapper[4970]: I1128 13:20:29.714565 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-6tn57" event={"ID":"973ba3b3-d07b-40ef-8419-40e19838e816","Type":"ContainerStarted","Data":"96494c56a27f1b8a1d188f44ee10daa5b96c48b1decebc18c504749dca4af4b6"} Nov 28 13:20:29 crc kubenswrapper[4970]: I1128 13:20:29.715200 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-dhdzs" event={"ID":"b0e1ef8e-6fc5-491d-b658-b812cc556f67","Type":"ContainerStarted","Data":"1423d36be9297412f4961a8b89beae1580c1ce528265841406ec51036b4ea89b"} Nov 28 13:20:29 crc kubenswrapper[4970]: I1128 13:20:29.782138 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 28 13:20:29 crc kubenswrapper[4970]: I1128 13:20:29.782422 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e80ce492-28d4-40cf-8a55-5a4f456e8255-client-ca\") pod \"controller-manager-879f6c89f-cbvqk\" (UID: \"e80ce492-28d4-40cf-8a55-5a4f456e8255\") " pod="openshift-controller-manager/controller-manager-879f6c89f-cbvqk" Nov 28 13:20:29 crc kubenswrapper[4970]: I1128 13:20:29.782485 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e80ce492-28d4-40cf-8a55-5a4f456e8255-serving-cert\") pod \"controller-manager-879f6c89f-cbvqk\" (UID: \"e80ce492-28d4-40cf-8a55-5a4f456e8255\") " pod="openshift-controller-manager/controller-manager-879f6c89f-cbvqk" Nov 28 13:20:29 crc kubenswrapper[4970]: E1128 13:20:29.782696 4970 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-28 13:20:30.282671 +0000 UTC m=+41.135552840 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 28 13:20:29 crc kubenswrapper[4970]: I1128 13:20:29.783398 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e80ce492-28d4-40cf-8a55-5a4f456e8255-client-ca\") pod \"controller-manager-879f6c89f-cbvqk\" (UID: \"e80ce492-28d4-40cf-8a55-5a4f456e8255\") " pod="openshift-controller-manager/controller-manager-879f6c89f-cbvqk" Nov 28 13:20:29 crc kubenswrapper[4970]: I1128 13:20:29.804906 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e80ce492-28d4-40cf-8a55-5a4f456e8255-serving-cert\") pod \"controller-manager-879f6c89f-cbvqk\" (UID: \"e80ce492-28d4-40cf-8a55-5a4f456e8255\") " pod="openshift-controller-manager/controller-manager-879f6c89f-cbvqk" Nov 28 13:20:29 crc kubenswrapper[4970]: I1128 13:20:29.887005 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tjj4p\" (UID: \"309db78c-54d0-452b-8b62-979217816260\") " pod="openshift-image-registry/image-registry-697d97f7c8-tjj4p" Nov 28 13:20:29 crc kubenswrapper[4970]: I1128 13:20:29.887126 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c76cfdfa-c0f1-4f41-8f5b-83161f7f8bf0-metrics-certs\") pod \"network-metrics-daemon-4vr87\" (UID: \"c76cfdfa-c0f1-4f41-8f5b-83161f7f8bf0\") " pod="openshift-multus/network-metrics-daemon-4vr87" Nov 28 13:20:29 crc kubenswrapper[4970]: E1128 13:20:29.888360 4970 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-28 13:20:30.388345286 +0000 UTC m=+41.241227086 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tjj4p" (UID: "309db78c-54d0-452b-8b62-979217816260") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 28 13:20:29 crc kubenswrapper[4970]: I1128 13:20:29.894570 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c76cfdfa-c0f1-4f41-8f5b-83161f7f8bf0-metrics-certs\") pod \"network-metrics-daemon-4vr87\" (UID: \"c76cfdfa-c0f1-4f41-8f5b-83161f7f8bf0\") " pod="openshift-multus/network-metrics-daemon-4vr87" Nov 28 13:20:29 crc kubenswrapper[4970]: I1128 13:20:29.897135 4970 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-cbvqk" Nov 28 13:20:29 crc kubenswrapper[4970]: I1128 13:20:29.905672 4970 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-q9h56"] Nov 28 13:20:29 crc kubenswrapper[4970]: I1128 13:20:29.905721 4970 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-mdr5z"] Nov 28 13:20:29 crc kubenswrapper[4970]: I1128 13:20:29.926257 4970 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-cq28p"] Nov 28 13:20:29 crc kubenswrapper[4970]: I1128 13:20:29.939779 4970 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-rzrx9"] Nov 28 13:20:29 crc kubenswrapper[4970]: I1128 13:20:29.989737 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 28 13:20:29 crc kubenswrapper[4970]: E1128 13:20:29.990291 4970 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-28 13:20:30.490277322 +0000 UTC m=+41.343159122 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 28 13:20:30 crc kubenswrapper[4970]: I1128 13:20:30.060313 4970 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-pzs29"] Nov 28 13:20:30 crc kubenswrapper[4970]: I1128 13:20:30.062359 4970 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-xqmqh"] Nov 28 13:20:30 crc kubenswrapper[4970]: I1128 13:20:30.069881 4970 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-4s4w9"] Nov 28 13:20:30 crc kubenswrapper[4970]: I1128 13:20:30.088312 4970 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-xzv2b"] Nov 28 13:20:30 crc kubenswrapper[4970]: I1128 13:20:30.092164 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tjj4p\" (UID: \"309db78c-54d0-452b-8b62-979217816260\") " pod="openshift-image-registry/image-registry-697d97f7c8-tjj4p" Nov 28 13:20:30 crc kubenswrapper[4970]: E1128 13:20:30.092892 4970 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-28 13:20:30.592877398 +0000 UTC m=+41.445759208 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tjj4p" (UID: "309db78c-54d0-452b-8b62-979217816260") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 28 13:20:30 crc kubenswrapper[4970]: I1128 13:20:30.132698 4970 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-4vr87" Nov 28 13:20:30 crc kubenswrapper[4970]: I1128 13:20:30.159471 4970 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-ts64w"] Nov 28 13:20:30 crc kubenswrapper[4970]: I1128 13:20:30.164597 4970 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-kfmkb"] Nov 28 13:20:30 crc kubenswrapper[4970]: I1128 13:20:30.167486 4970 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-nwp64"] Nov 28 13:20:30 crc kubenswrapper[4970]: I1128 13:20:30.178933 4970 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-7nxhr"] Nov 28 13:20:30 crc kubenswrapper[4970]: I1128 13:20:30.178979 4970 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-znlgk"] Nov 28 13:20:30 crc kubenswrapper[4970]: I1128 13:20:30.179131 4970 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-xbnhw"] Nov 28 13:20:30 crc kubenswrapper[4970]: I1128 13:20:30.181771 4970 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-kwnx5"] Nov 28 13:20:30 crc kubenswrapper[4970]: I1128 13:20:30.188315 4970 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-4rqzq"] Nov 28 13:20:30 crc kubenswrapper[4970]: I1128 13:20:30.188361 4970 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-86qzl"] Nov 28 13:20:30 crc kubenswrapper[4970]: I1128 13:20:30.190369 4970 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-kdkxk"] Nov 28 13:20:30 crc kubenswrapper[4970]: I1128 13:20:30.192989 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 28 13:20:30 crc kubenswrapper[4970]: E1128 13:20:30.193324 4970 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-28 13:20:30.693307259 +0000 UTC m=+41.546189059 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 28 13:20:30 crc kubenswrapper[4970]: I1128 13:20:30.194306 4970 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-p47s7"] Nov 28 13:20:30 crc kubenswrapper[4970]: I1128 13:20:30.195973 4970 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-t7pk6"] Nov 28 13:20:30 crc kubenswrapper[4970]: I1128 13:20:30.197078 4970 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-gdzxc"] Nov 28 13:20:30 crc kubenswrapper[4970]: I1128 13:20:30.197729 4970 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-nccsb"] Nov 28 13:20:30 crc kubenswrapper[4970]: W1128 13:20:30.235998 4970 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3e2746d8_5b75_43dc_80f4_81f69c38bb35.slice/crio-77f678c7e9a59acaecd6086762673eefe658fb0943a27a251a53b2ca6a399800 WatchSource:0}: Error finding container 77f678c7e9a59acaecd6086762673eefe658fb0943a27a251a53b2ca6a399800: Status 404 returned error can't find the container with id 77f678c7e9a59acaecd6086762673eefe658fb0943a27a251a53b2ca6a399800 Nov 28 13:20:30 crc kubenswrapper[4970]: I1128 13:20:30.295460 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tjj4p\" (UID: \"309db78c-54d0-452b-8b62-979217816260\") " pod="openshift-image-registry/image-registry-697d97f7c8-tjj4p" Nov 28 13:20:30 crc kubenswrapper[4970]: E1128 13:20:30.295837 4970 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-28 13:20:30.795826632 +0000 UTC m=+41.648708432 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tjj4p" (UID: "309db78c-54d0-452b-8b62-979217816260") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 28 13:20:30 crc kubenswrapper[4970]: I1128 13:20:30.336601 4970 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29405595-bgxjt"] Nov 28 13:20:30 crc kubenswrapper[4970]: W1128 13:20:30.350109 4970 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod34a23190_6bba_4504_ada1_0724c8a4f1df.slice/crio-a70795235818636706b4f02ce7e0b78b0440af8d267a5588118720cf2ff8dcbd WatchSource:0}: Error finding container a70795235818636706b4f02ce7e0b78b0440af8d267a5588118720cf2ff8dcbd: Status 404 returned error can't find the container with id a70795235818636706b4f02ce7e0b78b0440af8d267a5588118720cf2ff8dcbd Nov 28 13:20:30 crc kubenswrapper[4970]: I1128 13:20:30.399287 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 28 13:20:30 crc kubenswrapper[4970]: E1128 13:20:30.400294 4970 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-28 13:20:30.900267522 +0000 UTC m=+41.753149322 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 28 13:20:30 crc kubenswrapper[4970]: I1128 13:20:30.400847 4970 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-rjnn5"] Nov 28 13:20:30 crc kubenswrapper[4970]: W1128 13:20:30.445374 4970 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5a56c71c_0c49_4a0a_aee0_2a1ef5936574.slice/crio-8bba620d67dcbee5caba68f99e057cb91d66441e6ec9520b943653e2bbd71e45 WatchSource:0}: Error finding container 8bba620d67dcbee5caba68f99e057cb91d66441e6ec9520b943653e2bbd71e45: Status 404 returned error can't find the container with id 8bba620d67dcbee5caba68f99e057cb91d66441e6ec9520b943653e2bbd71e45 Nov 28 13:20:30 crc kubenswrapper[4970]: I1128 13:20:30.506026 4970 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-7bxck"] Nov 28 13:20:30 crc kubenswrapper[4970]: I1128 13:20:30.530517 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tjj4p\" (UID: \"309db78c-54d0-452b-8b62-979217816260\") " pod="openshift-image-registry/image-registry-697d97f7c8-tjj4p" Nov 28 13:20:30 crc kubenswrapper[4970]: E1128 13:20:30.530844 4970 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-28 13:20:31.030832812 +0000 UTC m=+41.883714602 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tjj4p" (UID: "309db78c-54d0-452b-8b62-979217816260") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 28 13:20:30 crc kubenswrapper[4970]: I1128 13:20:30.540919 4970 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-pldh4"] Nov 28 13:20:30 crc kubenswrapper[4970]: I1128 13:20:30.601875 4970 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-cbvqk"] Nov 28 13:20:30 crc kubenswrapper[4970]: I1128 13:20:30.622338 4970 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-4vr87"] Nov 28 13:20:30 crc kubenswrapper[4970]: I1128 13:20:30.639025 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 28 13:20:30 crc kubenswrapper[4970]: E1128 13:20:30.640053 4970 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-28 13:20:31.140037772 +0000 UTC m=+41.992919572 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 28 13:20:30 crc kubenswrapper[4970]: W1128 13:20:30.673665 4970 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda60c525e_c2a9_4977_ae16_e2be015eab30.slice/crio-be1be94dc25cce2afe2a995cac2229d08f20ade4755685cda4544122a3de0634 WatchSource:0}: Error finding container be1be94dc25cce2afe2a995cac2229d08f20ade4755685cda4544122a3de0634: Status 404 returned error can't find the container with id be1be94dc25cce2afe2a995cac2229d08f20ade4755685cda4544122a3de0634 Nov 28 13:20:30 crc kubenswrapper[4970]: I1128 13:20:30.679681 4970 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-hv5px"] Nov 28 13:20:30 crc kubenswrapper[4970]: I1128 13:20:30.695587 4970 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-w4m45"] Nov 28 13:20:30 crc kubenswrapper[4970]: W1128 13:20:30.714059 4970 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc76cfdfa_c0f1_4f41_8f5b_83161f7f8bf0.slice/crio-a623920f62fa43fcd8c8f07420fad8df5e26d11f0ecbd26b32168249a61ea8d8 WatchSource:0}: Error finding container a623920f62fa43fcd8c8f07420fad8df5e26d11f0ecbd26b32168249a61ea8d8: Status 404 returned error can't find the container with id a623920f62fa43fcd8c8f07420fad8df5e26d11f0ecbd26b32168249a61ea8d8 Nov 28 13:20:30 crc kubenswrapper[4970]: I1128 13:20:30.738428 4970 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-2qght"] Nov 28 13:20:30 crc kubenswrapper[4970]: I1128 13:20:30.740685 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tjj4p\" (UID: \"309db78c-54d0-452b-8b62-979217816260\") " pod="openshift-image-registry/image-registry-697d97f7c8-tjj4p" Nov 28 13:20:30 crc kubenswrapper[4970]: E1128 13:20:30.741109 4970 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-28 13:20:31.241093832 +0000 UTC m=+42.093975632 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tjj4p" (UID: "309db78c-54d0-452b-8b62-979217816260") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 28 13:20:30 crc kubenswrapper[4970]: I1128 13:20:30.746585 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-gdzxc" event={"ID":"5d8594d0-33ac-470e-b2d5-65d3afaa625b","Type":"ContainerStarted","Data":"de9d6f41898fd957b5d7ee232d57b3ecaa04cdfae7c5040caef2014de629597d"} Nov 28 13:20:30 crc kubenswrapper[4970]: I1128 13:20:30.749881 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-cq28p" event={"ID":"9cd2da9c-112f-4043-af6f-a661a475cc2d","Type":"ContainerStarted","Data":"bf2366918b32bf5d721052a1352fd312679bfae0d865916387416d6238a58c10"} Nov 28 13:20:30 crc kubenswrapper[4970]: I1128 13:20:30.749929 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-cq28p" event={"ID":"9cd2da9c-112f-4043-af6f-a661a475cc2d","Type":"ContainerStarted","Data":"c39276b8531c34d8a26bc7d0e665002a7b5f6bdbaca75696e04323b328c7fba5"} Nov 28 13:20:30 crc kubenswrapper[4970]: W1128 13:20:30.751784 4970 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod849a6379_8100_4799_aa30_06c1359673b7.slice/crio-016ad987958959d40cea43b0d39093850e4794793e6dfa9cd8550c1d59649331 WatchSource:0}: Error finding container 016ad987958959d40cea43b0d39093850e4794793e6dfa9cd8550c1d59649331: Status 404 returned error can't find the container with id 016ad987958959d40cea43b0d39093850e4794793e6dfa9cd8550c1d59649331 Nov 28 13:20:30 crc kubenswrapper[4970]: I1128 13:20:30.754436 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-l7c64" event={"ID":"15dfbf5e-77ac-477c-89f7-b1e035a219c0","Type":"ContainerStarted","Data":"c6df9986e313cf1d811638c2218e22b2989d7da2fe4920a0b99eabfbdf5b092b"} Nov 28 13:20:30 crc kubenswrapper[4970]: I1128 13:20:30.755992 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-q9h56" event={"ID":"aa0691f5-c326-4370-a718-bc82bcfbdd78","Type":"ContainerStarted","Data":"8f52c56861167ad6a21832099697263489bfe6b8871ece1927fd7fb8f165806a"} Nov 28 13:20:30 crc kubenswrapper[4970]: I1128 13:20:30.756034 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-q9h56" event={"ID":"aa0691f5-c326-4370-a718-bc82bcfbdd78","Type":"ContainerStarted","Data":"2273f41dceb0b419c97dc3dfeb09cd0dfb075f4ddb1fb77b74d0a473c99cfe1b"} Nov 28 13:20:30 crc kubenswrapper[4970]: I1128 13:20:30.757495 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-rjnn5" event={"ID":"05583a8a-e618-4f75-9ce1-b7ada5fc1ae7","Type":"ContainerStarted","Data":"8f2cf989e5ef683607539882bb0e765cbb4e1bca4f53b893bb1eb35bff39f92c"} Nov 28 13:20:30 crc kubenswrapper[4970]: I1128 13:20:30.764727 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/machine-config-operator-74547568cd-4rqzq" event={"ID":"6ff1b626-d50c-4608-be78-c27b787cc369","Type":"ContainerStarted","Data":"221c4cf857c71f8724a507138417356393127d1d046675f1fff12eebac67199b"} Nov 28 13:20:30 crc kubenswrapper[4970]: I1128 13:20:30.771887 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-mdr5z" event={"ID":"19ff5c79-1e07-4c43-8d35-bdf19869c72b","Type":"ContainerStarted","Data":"36a27dccadee3142291e0e366d12d1789eb862d7b41289777324e50b8838d239"} Nov 28 13:20:30 crc kubenswrapper[4970]: I1128 13:20:30.778688 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/cni-sysctl-allowlist-ds-8btsr" event={"ID":"a5b9dda0-da70-4e7c-850b-de8b7744a15c","Type":"ContainerStarted","Data":"2fc28451c84d1813a773345a163950b323fe8d1c9f7deb705e1af63c1f469e25"} Nov 28 13:20:30 crc kubenswrapper[4970]: I1128 13:20:30.779366 4970 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-multus/cni-sysctl-allowlist-ds-8btsr" Nov 28 13:20:30 crc kubenswrapper[4970]: I1128 13:20:30.783759 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-7bxck" event={"ID":"23eb3763-e0a8-4069-baa1-79134daba66e","Type":"ContainerStarted","Data":"94edba765d3c0d746ecf861ff8e6264456d73ba75af5409e429ee3c98bb5a6b3"} Nov 28 13:20:30 crc kubenswrapper[4970]: I1128 13:20:30.826190 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-cdzrp" event={"ID":"65101460-48b8-4bd6-82b0-4f5bd4254ec5","Type":"ContainerStarted","Data":"9c6804f75305b0a57406da7538db40ab4eb23ecbedfba98d8420b87f8bd0bff0"} Nov 28 13:20:30 crc kubenswrapper[4970]: I1128 13:20:30.828773 4970 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-558db77b4-cdzrp" Nov 28 13:20:30 crc kubenswrapper[4970]: I1128 13:20:30.841652 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 28 13:20:30 crc kubenswrapper[4970]: E1128 13:20:30.841946 4970 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-28 13:20:31.341925546 +0000 UTC m=+42.194807346 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 28 13:20:30 crc kubenswrapper[4970]: I1128 13:20:30.869128 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-4s4w9" event={"ID":"44b352f9-ee16-45a4-9674-e954eeed9c6c","Type":"ContainerStarted","Data":"03d8ecf38aa170757de0efef5fe680c2127312bf76cf2633f9de9e74c4e22f1f"} Nov 28 13:20:30 crc kubenswrapper[4970]: I1128 13:20:30.869166 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-4s4w9" event={"ID":"44b352f9-ee16-45a4-9674-e954eeed9c6c","Type":"ContainerStarted","Data":"b61f445c8d889b12c4b5a48249af944eaea0bb6063b647db3f069add8f3633c4"} Nov 28 13:20:30 crc kubenswrapper[4970]: I1128 13:20:30.886988 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-nccsb" event={"ID":"514859c2-bd3c-4ccb-90b0-61180a1bc297","Type":"ContainerStarted","Data":"c47d3ee97917003dda9f2a5b183cf5f436c96b979077fc232c4e861fa33d4ef8"} Nov 28 13:20:30 crc kubenswrapper[4970]: I1128 13:20:30.906446 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-6rjdm" event={"ID":"e1b7817f-3229-472a-b433-c2173e7abf6c","Type":"ContainerStarted","Data":"4b653ba2d0f56b0b1daaa9f197dce3e1f0b96c2aaceaa6e90375c71dd77adffb"} Nov 28 13:20:30 crc kubenswrapper[4970]: I1128 13:20:30.920425 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-cbvqk" event={"ID":"e80ce492-28d4-40cf-8a55-5a4f456e8255","Type":"ContainerStarted","Data":"9b529263a5dbbe3335d70b89eaa6aa73285801afa02ebf6f982dadc80b9b751a"} Nov 28 13:20:30 crc kubenswrapper[4970]: I1128 13:20:30.922645 4970 generic.go:334] "Generic (PLEG): container finished" podID="3c65c694-b05d-40db-a754-9b530aadc7a7" containerID="20f5ad1ae3873a01216e7f581830995d89d00dff1a967c1762bdcb271cd59ff5" exitCode=0 Nov 28 13:20:30 crc kubenswrapper[4970]: I1128 13:20:30.922703 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-v5kwk" event={"ID":"3c65c694-b05d-40db-a754-9b530aadc7a7","Type":"ContainerDied","Data":"20f5ad1ae3873a01216e7f581830995d89d00dff1a967c1762bdcb271cd59ff5"} Nov 28 13:20:30 crc kubenswrapper[4970]: I1128 13:20:30.939584 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-pldh4" event={"ID":"a60c525e-c2a9-4977-ae16-e2be015eab30","Type":"ContainerStarted","Data":"be1be94dc25cce2afe2a995cac2229d08f20ade4755685cda4544122a3de0634"} Nov 28 13:20:30 crc kubenswrapper[4970]: I1128 13:20:30.941516 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-9chbz" event={"ID":"79187155-9c7e-48a9-a3f8-3bcf8d921be6","Type":"ContainerStarted","Data":"82c541362eea65958ea754653e2f8a08679ab5fcee0e7bcd9735d7344c75cd02"} Nov 28 13:20:30 crc kubenswrapper[4970]: I1128 13:20:30.941937 4970 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-console/downloads-7954f5f757-9chbz" Nov 28 13:20:30 crc kubenswrapper[4970]: I1128 13:20:30.942755 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-pzs29" event={"ID":"1c43e47b-7ccb-41ec-8f8f-08b159bb15f3","Type":"ContainerStarted","Data":"8324ffe74b597f199efc1fe9f5bf203a539b1b36fad9676a04c190b214d63a6b"} Nov 28 13:20:30 crc kubenswrapper[4970]: I1128 13:20:30.942778 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-pzs29" event={"ID":"1c43e47b-7ccb-41ec-8f8f-08b159bb15f3","Type":"ContainerStarted","Data":"d650e88f118819c318036cf57a979741f80ba639905a30bd08dbbb0d34c8b0d8"} Nov 28 13:20:30 crc kubenswrapper[4970]: I1128 13:20:30.943124 4970 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console-operator/console-operator-58897d9998-pzs29" Nov 28 13:20:30 crc kubenswrapper[4970]: I1128 13:20:30.943159 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tjj4p\" (UID: \"309db78c-54d0-452b-8b62-979217816260\") " pod="openshift-image-registry/image-registry-697d97f7c8-tjj4p" Nov 28 13:20:30 crc kubenswrapper[4970]: E1128 13:20:30.946437 4970 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-28 13:20:31.446421147 +0000 UTC m=+42.299302947 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tjj4p" (UID: "309db78c-54d0-452b-8b62-979217816260") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 28 13:20:30 crc kubenswrapper[4970]: I1128 13:20:30.947139 4970 patch_prober.go:28] interesting pod/downloads-7954f5f757-9chbz container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.12:8080/\": dial tcp 10.217.0.12:8080: connect: connection refused" start-of-body= Nov 28 13:20:30 crc kubenswrapper[4970]: I1128 13:20:30.947165 4970 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-9chbz" podUID="79187155-9c7e-48a9-a3f8-3bcf8d921be6" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.12:8080/\": dial tcp 10.217.0.12:8080: connect: connection refused" Nov 28 13:20:30 crc kubenswrapper[4970]: I1128 13:20:30.948178 4970 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-server-l7c64" podStartSLOduration=5.948161358 podStartE2EDuration="5.948161358s" podCreationTimestamp="2025-11-28 13:20:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-28 13:20:30.946124448 +0000 UTC m=+41.799006248" watchObservedRunningTime="2025-11-28 13:20:30.948161358 +0000 UTC m=+41.801043158" Nov 28 13:20:30 crc kubenswrapper[4970]: I1128 13:20:30.956263 4970 
patch_prober.go:28] interesting pod/console-operator-58897d9998-pzs29 container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.24:8443/readyz\": dial tcp 10.217.0.24:8443: connect: connection refused" start-of-body= Nov 28 13:20:30 crc kubenswrapper[4970]: I1128 13:20:30.956320 4970 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-pzs29" podUID="1c43e47b-7ccb-41ec-8f8f-08b159bb15f3" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.24:8443/readyz\": dial tcp 10.217.0.24:8443: connect: connection refused" Nov 28 13:20:30 crc kubenswrapper[4970]: I1128 13:20:30.963409 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-xzv2b" event={"ID":"3bf3cce7-2248-44a8-a39d-54860c49fb5f","Type":"ContainerStarted","Data":"172ffa971384b386f1e9b50fce6b7a1f51dc6bd38140d6d19aacc9a455d8c913"} Nov 28 13:20:30 crc kubenswrapper[4970]: I1128 13:20:30.963466 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-xzv2b" event={"ID":"3bf3cce7-2248-44a8-a39d-54860c49fb5f","Type":"ContainerStarted","Data":"bcb73a818394901796b15845836cab37f880c9201e14243f6fb63116e5fa0779"} Nov 28 13:20:30 crc kubenswrapper[4970]: I1128 13:20:30.994780 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-hvcsr" event={"ID":"b9e4bbc0-c71d-4cb0-82ab-a3c67a9a4894","Type":"ContainerStarted","Data":"a9757a2ea577b42297536131ca3f9d4cb013aacf27edb27e5fe52959bd5ad720"} Nov 28 13:20:31 crc kubenswrapper[4970]: I1128 13:20:31.031073 4970 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-4s4w9" podStartSLOduration=18.031055743 podStartE2EDuration="18.031055743s" podCreationTimestamp="2025-11-28 13:20:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-28 13:20:31.013255508 +0000 UTC m=+41.866137308" watchObservedRunningTime="2025-11-28 13:20:31.031055743 +0000 UTC m=+41.883937533" Nov 28 13:20:31 crc kubenswrapper[4970]: I1128 13:20:31.048492 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 28 13:20:31 crc kubenswrapper[4970]: E1128 13:20:31.050264 4970 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-28 13:20:31.550244568 +0000 UTC m=+42.403126368 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 28 13:20:31 crc kubenswrapper[4970]: I1128 13:20:31.082306 4970 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-558db77b4-cdzrp" podStartSLOduration=18.082281643 podStartE2EDuration="18.082281643s" podCreationTimestamp="2025-11-28 13:20:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-28 13:20:31.068648391 +0000 UTC m=+41.921530191" watchObservedRunningTime="2025-11-28 13:20:31.082281643 +0000 UTC m=+41.935163443" Nov 28 13:20:31 crc kubenswrapper[4970]: I1128 13:20:31.090932 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29405595-bgxjt" event={"ID":"78210a9d-d2ee-4d21-a0e5-956cb8fd85d2","Type":"ContainerStarted","Data":"c169f5c9f508bf86308ea14824e6e46b1c74e2ec94d26067c6007aa80b1b5494"} Nov 28 13:20:31 crc kubenswrapper[4970]: I1128 13:20:31.096101 4970 generic.go:334] "Generic (PLEG): container finished" podID="336437f6-aba2-46ae-bf5f-2555d2db13fb" containerID="32b41eb563696725905cd8fcda3990a32dade80a90dcd41dd16c2768669c1736" exitCode=0 Nov 28 13:20:31 crc kubenswrapper[4970]: I1128 13:20:31.096404 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-78pq5" event={"ID":"336437f6-aba2-46ae-bf5f-2555d2db13fb","Type":"ContainerDied","Data":"32b41eb563696725905cd8fcda3990a32dade80a90dcd41dd16c2768669c1736"} Nov 28 13:20:31 crc kubenswrapper[4970]: I1128 13:20:31.097470 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-t7pk6" event={"ID":"34a23190-6bba-4504-ada1-0724c8a4f1df","Type":"ContainerStarted","Data":"dab93f5e54c9edc617d7cbcb0450a5359b6143357b25c1ce824d46626f935dd3"} Nov 28 13:20:31 crc kubenswrapper[4970]: I1128 13:20:31.097489 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-t7pk6" event={"ID":"34a23190-6bba-4504-ada1-0724c8a4f1df","Type":"ContainerStarted","Data":"a70795235818636706b4f02ce7e0b78b0440af8d267a5588118720cf2ff8dcbd"} Nov 28 13:20:31 crc kubenswrapper[4970]: I1128 13:20:31.126885 4970 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-cq28p" podStartSLOduration=17.126867288 podStartE2EDuration="17.126867288s" podCreationTimestamp="2025-11-28 13:20:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-28 13:20:31.10827075 +0000 UTC m=+41.961152550" watchObservedRunningTime="2025-11-28 13:20:31.126867288 +0000 UTC m=+41.979749078" Nov 28 13:20:31 crc kubenswrapper[4970]: I1128 13:20:31.127481 4970 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/cni-sysctl-allowlist-ds-8btsr" podStartSLOduration=6.127477376 podStartE2EDuration="6.127477376s" 
podCreationTimestamp="2025-11-28 13:20:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-28 13:20:31.126960621 +0000 UTC m=+41.979842411" watchObservedRunningTime="2025-11-28 13:20:31.127477376 +0000 UTC m=+41.980359176" Nov 28 13:20:31 crc kubenswrapper[4970]: I1128 13:20:31.165722 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tjj4p\" (UID: \"309db78c-54d0-452b-8b62-979217816260\") " pod="openshift-image-registry/image-registry-697d97f7c8-tjj4p" Nov 28 13:20:31 crc kubenswrapper[4970]: I1128 13:20:31.174787 4970 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-6rjdm" podStartSLOduration=18.174772501 podStartE2EDuration="18.174772501s" podCreationTimestamp="2025-11-28 13:20:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-28 13:20:31.17273146 +0000 UTC m=+42.025613260" watchObservedRunningTime="2025-11-28 13:20:31.174772501 +0000 UTC m=+42.027654301" Nov 28 13:20:31 crc kubenswrapper[4970]: E1128 13:20:31.176500 4970 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-28 13:20:31.676479991 +0000 UTC m=+42.529361791 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tjj4p" (UID: "309db78c-54d0-452b-8b62-979217816260") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 28 13:20:31 crc kubenswrapper[4970]: I1128 13:20:31.179539 4970 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-558db77b4-cdzrp" Nov 28 13:20:31 crc kubenswrapper[4970]: I1128 13:20:31.205336 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-xqmqh" event={"ID":"8d51dda0-3be6-42ef-a2ff-76d5c2da9b9b","Type":"ContainerStarted","Data":"a44a3cc2f17dbfe847548067cfe01ee07480d4bdd3691be460ab2cde8532176d"} Nov 28 13:20:31 crc kubenswrapper[4970]: I1128 13:20:31.205379 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-xqmqh" event={"ID":"8d51dda0-3be6-42ef-a2ff-76d5c2da9b9b","Type":"ContainerStarted","Data":"40fd3c8655178b412c9b5faed05c5ae7f0198efaa6496fa581ae3fa393b454df"} Nov 28 13:20:31 crc kubenswrapper[4970]: I1128 13:20:31.239345 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-p47s7" event={"ID":"dc2cf336-9480-4170-b13a-1f9b0d3cbcba","Type":"ContainerStarted","Data":"16945e588046c3635db0b651a23cec64fe091c220d83c2638bae50d7e8c62218"} Nov 28 13:20:31 crc kubenswrapper[4970]: I1128 13:20:31.263601 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-kdkxk" 
event={"ID":"a4f08974-1f20-4fe4-a63c-3c69b3064e4b","Type":"ContainerStarted","Data":"98dcb86cffad2408539b29755da0eda47db6be5c11a9c7a08b789b91370ab551"} Nov 28 13:20:31 crc kubenswrapper[4970]: I1128 13:20:31.267314 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 28 13:20:31 crc kubenswrapper[4970]: E1128 13:20:31.268259 4970 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-28 13:20:31.768243037 +0000 UTC m=+42.621124837 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 28 13:20:31 crc kubenswrapper[4970]: I1128 13:20:31.268463 4970 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-58897d9998-pzs29" podStartSLOduration=18.268444993 podStartE2EDuration="18.268444993s" podCreationTimestamp="2025-11-28 13:20:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-28 13:20:31.23614973 +0000 UTC m=+42.089031520" watchObservedRunningTime="2025-11-28 13:20:31.268444993 +0000 UTC m=+42.121326793" Nov 28 13:20:31 crc kubenswrapper[4970]: I1128 13:20:31.272709 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-7nxhr" event={"ID":"5a56c71c-0c49-4a0a-aee0-2a1ef5936574","Type":"ContainerStarted","Data":"8bba620d67dcbee5caba68f99e057cb91d66441e6ec9520b943653e2bbd71e45"} Nov 28 13:20:31 crc kubenswrapper[4970]: I1128 13:20:31.281435 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-xbnhw" event={"ID":"0c874537-15c1-4f94-b6cd-086c9c1762f0","Type":"ContainerStarted","Data":"903bc67a3602c2340c11969d990b0127197c9ab56de7ac2ee527b11f7e02a1fc"} Nov 28 13:20:31 crc kubenswrapper[4970]: I1128 13:20:31.282749 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-rzrx9" event={"ID":"f85acbe7-b253-4d4a-847f-05845804f712","Type":"ContainerStarted","Data":"8c438ba0b16317ec85a20007724c22a1ebe4a2c599809fb16b7fdbcdcbd2c7eb"} Nov 28 13:20:31 crc kubenswrapper[4970]: I1128 13:20:31.282772 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-rzrx9" event={"ID":"f85acbe7-b253-4d4a-847f-05845804f712","Type":"ContainerStarted","Data":"4f9030da4864cb73ac5cf5603794e62d4d07419f7aba0849671ad898b6584071"} Nov 28 13:20:31 crc kubenswrapper[4970]: I1128 13:20:31.283415 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-kfmkb" 
event={"ID":"3e2746d8-5b75-43dc-80f4-81f69c38bb35","Type":"ContainerStarted","Data":"77f678c7e9a59acaecd6086762673eefe658fb0943a27a251a53b2ca6a399800"} Nov 28 13:20:31 crc kubenswrapper[4970]: I1128 13:20:31.287145 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-dhdzs" event={"ID":"b0e1ef8e-6fc5-491d-b658-b812cc556f67","Type":"ContainerStarted","Data":"97399954603d42d68fa40948f50579051dfc79ead1f14334e2f29928e908c715"} Nov 28 13:20:31 crc kubenswrapper[4970]: I1128 13:20:31.297921 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-4vr87" event={"ID":"c76cfdfa-c0f1-4f41-8f5b-83161f7f8bf0","Type":"ContainerStarted","Data":"a623920f62fa43fcd8c8f07420fad8df5e26d11f0ecbd26b32168249a61ea8d8"} Nov 28 13:20:31 crc kubenswrapper[4970]: I1128 13:20:31.307579 4970 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-7954f5f757-9chbz" podStartSLOduration=18.307564376 podStartE2EDuration="18.307564376s" podCreationTimestamp="2025-11-28 13:20:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-28 13:20:31.269196015 +0000 UTC m=+42.122077815" watchObservedRunningTime="2025-11-28 13:20:31.307564376 +0000 UTC m=+42.160446176" Nov 28 13:20:31 crc kubenswrapper[4970]: I1128 13:20:31.310034 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-wms6k" event={"ID":"5998853c-3fbb-403e-b222-5a5c939dbb58","Type":"ContainerStarted","Data":"6da2c10b3c354f245e1bad15330e03a2da0afea7d277217923b3324e5ce99d71"} Nov 28 13:20:31 crc kubenswrapper[4970]: I1128 13:20:31.323657 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-nwp64" event={"ID":"e77b7513-27ee-47a7-b39a-a11dd78a0500","Type":"ContainerStarted","Data":"fa278f329300010298a294cdfa6c6852b9e3e9d06cdfcd856c4f2e60896fd46e"} Nov 28 13:20:31 crc kubenswrapper[4970]: I1128 13:20:31.336698 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-znlgk" event={"ID":"900e9596-8294-4c4d-857a-1b2bf9adaca7","Type":"ContainerStarted","Data":"c907cca720cf1bfebb42cb4366e30c6dbc83e45aae70d971f2c27188571238b6"} Nov 28 13:20:31 crc kubenswrapper[4970]: I1128 13:20:31.338716 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-ts64w" event={"ID":"6a678474-b488-4785-a87c-70df116b33c9","Type":"ContainerStarted","Data":"df31f2095a1ae27ee4793de87fb977a5aae5ae1527db7d986c83f8905aacb15f"} Nov 28 13:20:31 crc kubenswrapper[4970]: I1128 13:20:31.342982 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-86qzl" event={"ID":"2347a213-dd3e-4f1c-b36b-a8345bedb927","Type":"ContainerStarted","Data":"f5ee0d95326ff9eb5eacb359f334e43378145151cbf44030695a22ba13147edf"} Nov 28 13:20:31 crc kubenswrapper[4970]: I1128 13:20:31.347574 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-kwnx5" event={"ID":"cbdc9822-68a6-4bff-b373-cac82f25f4d3","Type":"ContainerStarted","Data":"e541881c06de1110cb82112f78fb828c2b5357e45d504e2bdddf5a0dcab8734f"} Nov 28 13:20:31 crc 
kubenswrapper[4970]: I1128 13:20:31.363300 4970 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-hvcsr" podStartSLOduration=17.363280659 podStartE2EDuration="17.363280659s" podCreationTimestamp="2025-11-28 13:20:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-28 13:20:31.309126062 +0000 UTC m=+42.162007862" watchObservedRunningTime="2025-11-28 13:20:31.363280659 +0000 UTC m=+42.216162459" Nov 28 13:20:31 crc kubenswrapper[4970]: I1128 13:20:31.365062 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-m8rgw" event={"ID":"e2bc2c5d-22da-4436-b7bc-0924d7f275f2","Type":"ContainerStarted","Data":"ca086a77c495a777fb8854adc4c740d51ee4244c9f0e6d755e4c51901f19bba6"} Nov 28 13:20:31 crc kubenswrapper[4970]: I1128 13:20:31.368936 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tjj4p\" (UID: \"309db78c-54d0-452b-8b62-979217816260\") " pod="openshift-image-registry/image-registry-697d97f7c8-tjj4p" Nov 28 13:20:31 crc kubenswrapper[4970]: E1128 13:20:31.369783 4970 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-28 13:20:31.869769131 +0000 UTC m=+42.722650931 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tjj4p" (UID: "309db78c-54d0-452b-8b62-979217816260") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 28 13:20:31 crc kubenswrapper[4970]: I1128 13:20:31.370861 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-6tn57" event={"ID":"973ba3b3-d07b-40ef-8419-40e19838e816","Type":"ContainerStarted","Data":"a3e4f876981ee2e371ee2290c900b6c2170a7066c5307e3698b4a394b2f4935d"} Nov 28 13:20:31 crc kubenswrapper[4970]: I1128 13:20:31.403621 4970 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication-operator/authentication-operator-69f744f599-xzv2b" podStartSLOduration=18.403603248 podStartE2EDuration="18.403603248s" podCreationTimestamp="2025-11-28 13:20:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-28 13:20:31.396105427 +0000 UTC m=+42.248987227" watchObservedRunningTime="2025-11-28 13:20:31.403603248 +0000 UTC m=+42.256485048" Nov 28 13:20:31 crc kubenswrapper[4970]: I1128 13:20:31.476072 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 28 13:20:31 crc 
kubenswrapper[4970]: E1128 13:20:31.476294 4970 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-28 13:20:31.976270301 +0000 UTC m=+42.829152101 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 28 13:20:31 crc kubenswrapper[4970]: I1128 13:20:31.476520 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tjj4p\" (UID: \"309db78c-54d0-452b-8b62-979217816260\") " pod="openshift-image-registry/image-registry-697d97f7c8-tjj4p" Nov 28 13:20:31 crc kubenswrapper[4970]: E1128 13:20:31.480420 4970 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-28 13:20:31.980401993 +0000 UTC m=+42.833283783 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tjj4p" (UID: "309db78c-54d0-452b-8b62-979217816260") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 28 13:20:31 crc kubenswrapper[4970]: I1128 13:20:31.494021 4970 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-86qzl" podStartSLOduration=17.494006574 podStartE2EDuration="17.494006574s" podCreationTimestamp="2025-11-28 13:20:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-28 13:20:31.493368895 +0000 UTC m=+42.346250695" watchObservedRunningTime="2025-11-28 13:20:31.494006574 +0000 UTC m=+42.346888374" Nov 28 13:20:31 crc kubenswrapper[4970]: I1128 13:20:31.541678 4970 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-dhdzs" podStartSLOduration=18.541661699 podStartE2EDuration="18.541661699s" podCreationTimestamp="2025-11-28 13:20:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-28 13:20:31.533074836 +0000 UTC m=+42.385956656" watchObservedRunningTime="2025-11-28 13:20:31.541661699 +0000 UTC m=+42.394543499" Nov 28 13:20:31 crc kubenswrapper[4970]: I1128 13:20:31.571301 4970 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-5444994796-6tn57" Nov 28 
13:20:31 crc kubenswrapper[4970]: I1128 13:20:31.572798 4970 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-5444994796-6tn57" podStartSLOduration=18.572786227 podStartE2EDuration="18.572786227s" podCreationTimestamp="2025-11-28 13:20:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-28 13:20:31.571612083 +0000 UTC m=+42.424493883" watchObservedRunningTime="2025-11-28 13:20:31.572786227 +0000 UTC m=+42.425668027" Nov 28 13:20:31 crc kubenswrapper[4970]: I1128 13:20:31.580901 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 28 13:20:31 crc kubenswrapper[4970]: E1128 13:20:31.581292 4970 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-28 13:20:32.081276017 +0000 UTC m=+42.934157817 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 28 13:20:31 crc kubenswrapper[4970]: I1128 13:20:31.584413 4970 patch_prober.go:28] interesting pod/router-default-5444994796-6tn57 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Nov 28 13:20:31 crc kubenswrapper[4970]: [-]has-synced failed: reason withheld Nov 28 13:20:31 crc kubenswrapper[4970]: [+]process-running ok Nov 28 13:20:31 crc kubenswrapper[4970]: healthz check failed Nov 28 13:20:31 crc kubenswrapper[4970]: I1128 13:20:31.584450 4970 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-6tn57" podUID="973ba3b3-d07b-40ef-8419-40e19838e816" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Nov 28 13:20:31 crc kubenswrapper[4970]: I1128 13:20:31.604058 4970 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd-operator/etcd-operator-b45778765-xqmqh" podStartSLOduration=18.604041659 podStartE2EDuration="18.604041659s" podCreationTimestamp="2025-11-28 13:20:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-28 13:20:31.603576675 +0000 UTC m=+42.456458475" watchObservedRunningTime="2025-11-28 13:20:31.604041659 +0000 UTC m=+42.456923459" Nov 28 13:20:31 crc kubenswrapper[4970]: I1128 13:20:31.686200 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod 
\"image-registry-697d97f7c8-tjj4p\" (UID: \"309db78c-54d0-452b-8b62-979217816260\") " pod="openshift-image-registry/image-registry-697d97f7c8-tjj4p" Nov 28 13:20:31 crc kubenswrapper[4970]: E1128 13:20:31.686992 4970 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-28 13:20:32.186973344 +0000 UTC m=+43.039855144 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tjj4p" (UID: "309db78c-54d0-452b-8b62-979217816260") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 28 13:20:31 crc kubenswrapper[4970]: I1128 13:20:31.789784 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 28 13:20:31 crc kubenswrapper[4970]: E1128 13:20:31.789976 4970 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-28 13:20:32.289943581 +0000 UTC m=+43.142825381 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 28 13:20:31 crc kubenswrapper[4970]: I1128 13:20:31.790177 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tjj4p\" (UID: \"309db78c-54d0-452b-8b62-979217816260\") " pod="openshift-image-registry/image-registry-697d97f7c8-tjj4p" Nov 28 13:20:31 crc kubenswrapper[4970]: E1128 13:20:31.791575 4970 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-28 13:20:32.291563958 +0000 UTC m=+43.144445768 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tjj4p" (UID: "309db78c-54d0-452b-8b62-979217816260") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 28 13:20:31 crc kubenswrapper[4970]: I1128 13:20:31.891354 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 28 13:20:31 crc kubenswrapper[4970]: E1128 13:20:31.891706 4970 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-28 13:20:32.391685821 +0000 UTC m=+43.244567621 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 28 13:20:31 crc kubenswrapper[4970]: I1128 13:20:31.992609 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tjj4p\" (UID: \"309db78c-54d0-452b-8b62-979217816260\") " pod="openshift-image-registry/image-registry-697d97f7c8-tjj4p" Nov 28 13:20:31 crc kubenswrapper[4970]: E1128 13:20:31.992960 4970 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-28 13:20:32.492944867 +0000 UTC m=+43.345826667 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tjj4p" (UID: "309db78c-54d0-452b-8b62-979217816260") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 28 13:20:32 crc kubenswrapper[4970]: I1128 13:20:32.093640 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 28 13:20:32 crc kubenswrapper[4970]: E1128 13:20:32.093964 4970 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-28 13:20:32.593936135 +0000 UTC m=+43.446817935 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 28 13:20:32 crc kubenswrapper[4970]: I1128 13:20:32.094125 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tjj4p\" (UID: \"309db78c-54d0-452b-8b62-979217816260\") " pod="openshift-image-registry/image-registry-697d97f7c8-tjj4p" Nov 28 13:20:32 crc kubenswrapper[4970]: E1128 13:20:32.094416 4970 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-28 13:20:32.594405309 +0000 UTC m=+43.447287109 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tjj4p" (UID: "309db78c-54d0-452b-8b62-979217816260") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 28 13:20:32 crc kubenswrapper[4970]: I1128 13:20:32.194862 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 28 13:20:32 crc kubenswrapper[4970]: E1128 13:20:32.195234 4970 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-28 13:20:32.695200531 +0000 UTC m=+43.548082331 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 28 13:20:32 crc kubenswrapper[4970]: I1128 13:20:32.296183 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tjj4p\" (UID: \"309db78c-54d0-452b-8b62-979217816260\") " pod="openshift-image-registry/image-registry-697d97f7c8-tjj4p" Nov 28 13:20:32 crc kubenswrapper[4970]: E1128 13:20:32.296501 4970 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-28 13:20:32.796489788 +0000 UTC m=+43.649371588 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tjj4p" (UID: "309db78c-54d0-452b-8b62-979217816260") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 28 13:20:32 crc kubenswrapper[4970]: I1128 13:20:32.386684 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-hv5px" event={"ID":"849a6379-8100-4799-aa30-06c1359673b7","Type":"ContainerStarted","Data":"016ad987958959d40cea43b0d39093850e4794793e6dfa9cd8550c1d59649331"} Nov 28 13:20:32 crc kubenswrapper[4970]: I1128 13:20:32.388486 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-xbnhw" event={"ID":"0c874537-15c1-4f94-b6cd-086c9c1762f0","Type":"ContainerStarted","Data":"46581f1f14b18b3fc9f5507bf7966d8d74a4245cce2dc75d7f2e622e05ccf639"} Nov 28 13:20:32 crc kubenswrapper[4970]: I1128 13:20:32.390030 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-kdkxk" event={"ID":"a4f08974-1f20-4fe4-a63c-3c69b3064e4b","Type":"ContainerStarted","Data":"fdf148191cd898d3f2b1ed90f026e5770dc2f46cf2eac74440b4bd7d8bafd702"} Nov 28 13:20:32 crc kubenswrapper[4970]: I1128 13:20:32.393259 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-kfmkb" event={"ID":"3e2746d8-5b75-43dc-80f4-81f69c38bb35","Type":"ContainerStarted","Data":"695207372c730f40c1685b281d0d7f032bac67a5efee6341daf63be048975337"} Nov 28 13:20:32 crc kubenswrapper[4970]: I1128 13:20:32.398383 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 28 13:20:32 crc kubenswrapper[4970]: E1128 13:20:32.398511 4970 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-28 13:20:32.898482165 +0000 UTC m=+43.751363985 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 28 13:20:32 crc kubenswrapper[4970]: I1128 13:20:32.398900 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tjj4p\" (UID: \"309db78c-54d0-452b-8b62-979217816260\") " pod="openshift-image-registry/image-registry-697d97f7c8-tjj4p" Nov 28 13:20:32 crc kubenswrapper[4970]: E1128 13:20:32.399507 4970 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-28 13:20:32.899479295 +0000 UTC m=+43.752361125 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tjj4p" (UID: "309db78c-54d0-452b-8b62-979217816260") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 28 13:20:32 crc kubenswrapper[4970]: I1128 13:20:32.399974 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-w4m45" event={"ID":"77028134-907f-445b-8470-d961a46beea4","Type":"ContainerStarted","Data":"a3108d0ee2e23e50fd225f1dfad3020b49eadd956377e98c5e11ec5714701b4e"} Nov 28 13:20:32 crc kubenswrapper[4970]: I1128 13:20:32.406559 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-2qght" event={"ID":"24797c3c-a0a6-4bab-b61e-dcc1aaedcccb","Type":"ContainerStarted","Data":"9cc1a3ebb0f83574059e82ce17595cb54c89e8ff4bcd0695fe0ae6f7d3e441a4"} Nov 28 13:20:32 crc kubenswrapper[4970]: I1128 13:20:32.408903 4970 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-kdkxk" podStartSLOduration=6.408882032 podStartE2EDuration="6.408882032s" podCreationTimestamp="2025-11-28 13:20:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-28 13:20:32.406843052 +0000 UTC m=+43.259724892" watchObservedRunningTime="2025-11-28 13:20:32.408882032 +0000 UTC m=+43.261763832" Nov 28 13:20:32 crc kubenswrapper[4970]: I1128 13:20:32.414324 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-ts64w" event={"ID":"6a678474-b488-4785-a87c-70df116b33c9","Type":"ContainerStarted","Data":"54e196b62d20ae10b9bfdc5154b6abf32f17659e0beca9bd0d28fa11e879d98b"} Nov 28 13:20:32 crc kubenswrapper[4970]: I1128 13:20:32.426893 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-86qzl" event={"ID":"2347a213-dd3e-4f1c-b36b-a8345bedb927","Type":"ContainerStarted","Data":"2220bad7bad9d648463696df03901943a9ad6ec463b2c1cc2652618d4ab663b3"} Nov 28 13:20:32 crc kubenswrapper[4970]: I1128 13:20:32.427082 4970 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-9c57cc56f-kfmkb" podStartSLOduration=18.427046418 podStartE2EDuration="18.427046418s" podCreationTimestamp="2025-11-28 13:20:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-28 13:20:32.423121282 +0000 UTC m=+43.276003142" watchObservedRunningTime="2025-11-28 13:20:32.427046418 +0000 UTC m=+43.279928278" Nov 28 13:20:32 crc kubenswrapper[4970]: I1128 13:20:32.431182 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-p47s7" event={"ID":"dc2cf336-9480-4170-b13a-1f9b0d3cbcba","Type":"ContainerStarted","Data":"6edb73f2b945667f7e91f7057fd99f1194ac87a7898fe552cea37cb4f077ca63"} Nov 28 13:20:32 crc kubenswrapper[4970]: I1128 13:20:32.436535 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-gdzxc" event={"ID":"5d8594d0-33ac-470e-b2d5-65d3afaa625b","Type":"ContainerStarted","Data":"b99af4a72ee913875876a63b8c7c69835b5e1d6f04beb54a26183f0754deb23a"} Nov 28 13:20:32 crc kubenswrapper[4970]: I1128 13:20:32.438399 4970 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-gdzxc" Nov 28 13:20:32 crc kubenswrapper[4970]: I1128 13:20:32.440228 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-nwp64" event={"ID":"e77b7513-27ee-47a7-b39a-a11dd78a0500","Type":"ContainerStarted","Data":"2597a7c59b0176f08cb873d28283d38f112b1082a06a4c14ad0957eacc1e09c7"} Nov 28 13:20:32 crc kubenswrapper[4970]: I1128 13:20:32.440836 4970 patch_prober.go:28] interesting pod/downloads-7954f5f757-9chbz container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.12:8080/\": dial tcp 10.217.0.12:8080: connect: connection refused" start-of-body= Nov 28 13:20:32 crc kubenswrapper[4970]: I1128 13:20:32.440872 4970 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-9chbz" podUID="79187155-9c7e-48a9-a3f8-3bcf8d921be6" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.12:8080/\": dial tcp 10.217.0.12:8080: connect: connection refused" Nov 28 13:20:32 crc kubenswrapper[4970]: I1128 13:20:32.447110 4970 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-gdzxc" Nov 28 13:20:32 crc kubenswrapper[4970]: I1128 13:20:32.458420 4970 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-ts64w" podStartSLOduration=18.458376091 podStartE2EDuration="18.458376091s" podCreationTimestamp="2025-11-28 13:20:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-28 13:20:32.453091566 +0000 UTC m=+43.305973406" watchObservedRunningTime="2025-11-28 13:20:32.458376091 +0000 
UTC m=+43.311257971" Nov 28 13:20:32 crc kubenswrapper[4970]: I1128 13:20:32.474717 4970 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-gdzxc" podStartSLOduration=18.474697323 podStartE2EDuration="18.474697323s" podCreationTimestamp="2025-11-28 13:20:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-28 13:20:32.474230709 +0000 UTC m=+43.327112509" watchObservedRunningTime="2025-11-28 13:20:32.474697323 +0000 UTC m=+43.327579123" Nov 28 13:20:32 crc kubenswrapper[4970]: I1128 13:20:32.497136 4970 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-nwp64" podStartSLOduration=19.497117824 podStartE2EDuration="19.497117824s" podCreationTimestamp="2025-11-28 13:20:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-28 13:20:32.495478515 +0000 UTC m=+43.348360315" watchObservedRunningTime="2025-11-28 13:20:32.497117824 +0000 UTC m=+43.349999624" Nov 28 13:20:32 crc kubenswrapper[4970]: I1128 13:20:32.502727 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 28 13:20:32 crc kubenswrapper[4970]: E1128 13:20:32.504866 4970 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-28 13:20:33.004845372 +0000 UTC m=+43.857727212 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 28 13:20:32 crc kubenswrapper[4970]: I1128 13:20:32.526037 4970 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-multus/cni-sysctl-allowlist-ds-8btsr" Nov 28 13:20:32 crc kubenswrapper[4970]: I1128 13:20:32.572856 4970 patch_prober.go:28] interesting pod/router-default-5444994796-6tn57 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Nov 28 13:20:32 crc kubenswrapper[4970]: [-]has-synced failed: reason withheld Nov 28 13:20:32 crc kubenswrapper[4970]: [+]process-running ok Nov 28 13:20:32 crc kubenswrapper[4970]: healthz check failed Nov 28 13:20:32 crc kubenswrapper[4970]: I1128 13:20:32.573390 4970 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-6tn57" podUID="973ba3b3-d07b-40ef-8419-40e19838e816" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Nov 28 13:20:32 crc kubenswrapper[4970]: I1128 13:20:32.581132 4970 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-58897d9998-pzs29" Nov 28 13:20:32 crc kubenswrapper[4970]: E1128 13:20:32.606379 4970 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-28 13:20:33.106359694 +0000 UTC m=+43.959241494 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tjj4p" (UID: "309db78c-54d0-452b-8b62-979217816260") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 28 13:20:32 crc kubenswrapper[4970]: I1128 13:20:32.605810 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tjj4p\" (UID: \"309db78c-54d0-452b-8b62-979217816260\") " pod="openshift-image-registry/image-registry-697d97f7c8-tjj4p" Nov 28 13:20:32 crc kubenswrapper[4970]: I1128 13:20:32.710413 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 28 13:20:32 crc kubenswrapper[4970]: E1128 13:20:32.710704 4970 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-28 13:20:33.210651119 +0000 UTC m=+44.063532929 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 28 13:20:32 crc kubenswrapper[4970]: I1128 13:20:32.711092 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tjj4p\" (UID: \"309db78c-54d0-452b-8b62-979217816260\") " pod="openshift-image-registry/image-registry-697d97f7c8-tjj4p" Nov 28 13:20:32 crc kubenswrapper[4970]: E1128 13:20:32.711605 4970 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-28 13:20:33.211593377 +0000 UTC m=+44.064475177 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tjj4p" (UID: "309db78c-54d0-452b-8b62-979217816260") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 28 13:20:32 crc kubenswrapper[4970]: I1128 13:20:32.827664 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 28 13:20:32 crc kubenswrapper[4970]: E1128 13:20:32.829004 4970 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-28 13:20:33.328971749 +0000 UTC m=+44.181853549 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 28 13:20:32 crc kubenswrapper[4970]: I1128 13:20:32.930144 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tjj4p\" (UID: \"309db78c-54d0-452b-8b62-979217816260\") " pod="openshift-image-registry/image-registry-697d97f7c8-tjj4p" Nov 28 13:20:32 crc kubenswrapper[4970]: E1128 13:20:32.930709 4970 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-28 13:20:33.430697498 +0000 UTC m=+44.283579298 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tjj4p" (UID: "309db78c-54d0-452b-8b62-979217816260") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 28 13:20:33 crc kubenswrapper[4970]: I1128 13:20:33.031663 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 28 13:20:33 crc kubenswrapper[4970]: E1128 13:20:33.031773 4970 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-28 13:20:33.531756218 +0000 UTC m=+44.384638018 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 28 13:20:33 crc kubenswrapper[4970]: I1128 13:20:33.032030 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tjj4p\" (UID: \"309db78c-54d0-452b-8b62-979217816260\") " pod="openshift-image-registry/image-registry-697d97f7c8-tjj4p" Nov 28 13:20:33 crc kubenswrapper[4970]: E1128 13:20:33.032329 4970 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-28 13:20:33.532320085 +0000 UTC m=+44.385201885 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tjj4p" (UID: "309db78c-54d0-452b-8b62-979217816260") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 28 13:20:33 crc kubenswrapper[4970]: I1128 13:20:33.133859 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 28 13:20:33 crc kubenswrapper[4970]: E1128 13:20:33.134112 4970 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-28 13:20:33.634070815 +0000 UTC m=+44.486952615 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 28 13:20:33 crc kubenswrapper[4970]: I1128 13:20:33.134187 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tjj4p\" (UID: \"309db78c-54d0-452b-8b62-979217816260\") " pod="openshift-image-registry/image-registry-697d97f7c8-tjj4p" Nov 28 13:20:33 crc kubenswrapper[4970]: E1128 13:20:33.134670 4970 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-28 13:20:33.634662253 +0000 UTC m=+44.487544053 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tjj4p" (UID: "309db78c-54d0-452b-8b62-979217816260") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 28 13:20:33 crc kubenswrapper[4970]: I1128 13:20:33.237793 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 28 13:20:33 crc kubenswrapper[4970]: E1128 13:20:33.238167 4970 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-28 13:20:33.738144364 +0000 UTC m=+44.591026164 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 28 13:20:33 crc kubenswrapper[4970]: I1128 13:20:33.338976 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tjj4p\" (UID: \"309db78c-54d0-452b-8b62-979217816260\") " pod="openshift-image-registry/image-registry-697d97f7c8-tjj4p" Nov 28 13:20:33 crc kubenswrapper[4970]: E1128 13:20:33.339337 4970 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-28 13:20:33.839323538 +0000 UTC m=+44.692205338 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tjj4p" (UID: "309db78c-54d0-452b-8b62-979217816260") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 28 13:20:33 crc kubenswrapper[4970]: I1128 13:20:33.440940 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 28 13:20:33 crc kubenswrapper[4970]: E1128 13:20:33.441934 4970 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-28 13:20:33.941912293 +0000 UTC m=+44.794794093 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 28 13:20:33 crc kubenswrapper[4970]: I1128 13:20:33.471816 4970 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-multus/cni-sysctl-allowlist-ds-8btsr"] Nov 28 13:20:33 crc kubenswrapper[4970]: I1128 13:20:33.487260 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-nccsb" event={"ID":"514859c2-bd3c-4ccb-90b0-61180a1bc297","Type":"ContainerStarted","Data":"8f71c320f5c6711aaf58cf57698a53bb88acc22f7294690f3fd6abe7955cdb28"} Nov 28 13:20:33 crc kubenswrapper[4970]: I1128 13:20:33.487945 4970 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-nccsb" Nov 28 13:20:33 crc kubenswrapper[4970]: I1128 13:20:33.489559 4970 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-nccsb container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.42:8080/healthz\": dial tcp 10.217.0.42:8080: connect: connection refused" start-of-body= Nov 28 13:20:33 crc kubenswrapper[4970]: I1128 13:20:33.489608 4970 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-nccsb" podUID="514859c2-bd3c-4ccb-90b0-61180a1bc297" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.42:8080/healthz\": dial tcp 10.217.0.42:8080: connect: connection refused" Nov 28 13:20:33 crc kubenswrapper[4970]: I1128 13:20:33.505515 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-78pq5" event={"ID":"336437f6-aba2-46ae-bf5f-2555d2db13fb","Type":"ContainerStarted","Data":"e93698dd34c0aaa8e6ecbe823cee18b1188d372f6b094f5309b92d49e92d31e8"} Nov 28 13:20:33 crc kubenswrapper[4970]: I1128 13:20:33.534522 4970 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-v5kwk" event={"ID":"3c65c694-b05d-40db-a754-9b530aadc7a7","Type":"ContainerStarted","Data":"47f3f65ddeca3f0bb17bcb0d43f04202390cad4b2b504821fcb61db9e065f4ba"} Nov 28 13:20:33 crc kubenswrapper[4970]: I1128 13:20:33.542622 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-q9h56" event={"ID":"aa0691f5-c326-4370-a718-bc82bcfbdd78","Type":"ContainerStarted","Data":"e5931769e8982b1dc7f643d823e1497fbd39ff91132b3d8b45d3408b86574ada"} Nov 28 13:20:33 crc kubenswrapper[4970]: I1128 13:20:33.544022 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tjj4p\" (UID: \"309db78c-54d0-452b-8b62-979217816260\") " pod="openshift-image-registry/image-registry-697d97f7c8-tjj4p" Nov 28 13:20:33 crc kubenswrapper[4970]: E1128 13:20:33.544456 4970 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-28 13:20:34.044440586 +0000 UTC m=+44.897322386 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tjj4p" (UID: "309db78c-54d0-452b-8b62-979217816260") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 28 13:20:33 crc kubenswrapper[4970]: I1128 13:20:33.552707 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-hv5px" event={"ID":"849a6379-8100-4799-aa30-06c1359673b7","Type":"ContainerStarted","Data":"d68e9cb9f03fae845189a45fee1de7cc35414731cc8e607b7172db4f1729bf23"} Nov 28 13:20:33 crc kubenswrapper[4970]: I1128 13:20:33.553672 4970 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-hv5px" Nov 28 13:20:33 crc kubenswrapper[4970]: I1128 13:20:33.570838 4970 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-nccsb" podStartSLOduration=19.570819934 podStartE2EDuration="19.570819934s" podCreationTimestamp="2025-11-28 13:20:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-28 13:20:33.513503094 +0000 UTC m=+44.366384894" watchObservedRunningTime="2025-11-28 13:20:33.570819934 +0000 UTC m=+44.423701724" Nov 28 13:20:33 crc kubenswrapper[4970]: I1128 13:20:33.572168 4970 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-v5kwk" podStartSLOduration=19.572133873 podStartE2EDuration="19.572133873s" podCreationTimestamp="2025-11-28 13:20:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-28 13:20:33.570725481 +0000 UTC m=+44.423607281" watchObservedRunningTime="2025-11-28 13:20:33.572133873 +0000 
UTC m=+44.425015663" Nov 28 13:20:33 crc kubenswrapper[4970]: I1128 13:20:33.573755 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-t7pk6" event={"ID":"34a23190-6bba-4504-ada1-0724c8a4f1df","Type":"ContainerStarted","Data":"da2d0f2683eab4aa8d95f1e3ec12196b20a81d469370e5ac193d12869e561c98"} Nov 28 13:20:33 crc kubenswrapper[4970]: I1128 13:20:33.574393 4970 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-t7pk6" Nov 28 13:20:33 crc kubenswrapper[4970]: I1128 13:20:33.574506 4970 patch_prober.go:28] interesting pod/router-default-5444994796-6tn57 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Nov 28 13:20:33 crc kubenswrapper[4970]: [-]has-synced failed: reason withheld Nov 28 13:20:33 crc kubenswrapper[4970]: [+]process-running ok Nov 28 13:20:33 crc kubenswrapper[4970]: healthz check failed Nov 28 13:20:33 crc kubenswrapper[4970]: I1128 13:20:33.574534 4970 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-6tn57" podUID="973ba3b3-d07b-40ef-8419-40e19838e816" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Nov 28 13:20:33 crc kubenswrapper[4970]: I1128 13:20:33.600342 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-rzrx9" event={"ID":"f85acbe7-b253-4d4a-847f-05845804f712","Type":"ContainerStarted","Data":"24f4f0b469744e7faf04da9f2c102a8f2520a2306e67b51f80315c27f8492a41"} Nov 28 13:20:33 crc kubenswrapper[4970]: I1128 13:20:33.615824 4970 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns-operator/dns-operator-744455d44c-q9h56" podStartSLOduration=20.615809791 podStartE2EDuration="20.615809791s" podCreationTimestamp="2025-11-28 13:20:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-28 13:20:33.615484601 +0000 UTC m=+44.468366411" watchObservedRunningTime="2025-11-28 13:20:33.615809791 +0000 UTC m=+44.468691591" Nov 28 13:20:33 crc kubenswrapper[4970]: I1128 13:20:33.617293 4970 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-hv5px" podStartSLOduration=19.617285004 podStartE2EDuration="19.617285004s" podCreationTimestamp="2025-11-28 13:20:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-28 13:20:33.590201336 +0000 UTC m=+44.443083136" watchObservedRunningTime="2025-11-28 13:20:33.617285004 +0000 UTC m=+44.470166804" Nov 28 13:20:33 crc kubenswrapper[4970]: I1128 13:20:33.623131 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-7bxck" event={"ID":"23eb3763-e0a8-4069-baa1-79134daba66e","Type":"ContainerStarted","Data":"791f0c1e5c31c9e5f44ff014f0bdc7d9764af3285955acfe3eb95a5d6901a21b"} Nov 28 13:20:33 crc kubenswrapper[4970]: I1128 13:20:33.644683 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 28 13:20:33 crc kubenswrapper[4970]: E1128 13:20:33.644826 4970 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-28 13:20:34.144798656 +0000 UTC m=+44.997680456 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 28 13:20:33 crc kubenswrapper[4970]: I1128 13:20:33.645103 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tjj4p\" (UID: \"309db78c-54d0-452b-8b62-979217816260\") " pod="openshift-image-registry/image-registry-697d97f7c8-tjj4p" Nov 28 13:20:33 crc kubenswrapper[4970]: E1128 13:20:33.646206 4970 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-28 13:20:34.146198987 +0000 UTC m=+44.999080787 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tjj4p" (UID: "309db78c-54d0-452b-8b62-979217816260") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 28 13:20:33 crc kubenswrapper[4970]: I1128 13:20:33.687741 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-m8rgw" event={"ID":"e2bc2c5d-22da-4436-b7bc-0924d7f275f2","Type":"ContainerStarted","Data":"825d67d7ae7d011ae10369932a4c9d596bd29fe7db49201a0d344b2c52d1f4a4"} Nov 28 13:20:33 crc kubenswrapper[4970]: I1128 13:20:33.706170 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-kwnx5" event={"ID":"cbdc9822-68a6-4bff-b373-cac82f25f4d3","Type":"ContainerStarted","Data":"dbf00287bee154b8b606404c2b7bfe106cffba469d8e010f1a04b112687949da"} Nov 28 13:20:33 crc kubenswrapper[4970]: I1128 13:20:33.740404 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-p47s7" event={"ID":"dc2cf336-9480-4170-b13a-1f9b0d3cbcba","Type":"ContainerStarted","Data":"f19a555a0e6afbaf58c0e0f3686d383859843124fdb446baeb110e75e726f5c7"} Nov 28 13:20:33 crc kubenswrapper[4970]: I1128 13:20:33.740718 4970 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-dns/dns-default-p47s7" Nov 28 13:20:33 crc kubenswrapper[4970]: I1128 13:20:33.746107 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 28 13:20:33 crc kubenswrapper[4970]: E1128 13:20:33.747122 4970 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-28 13:20:34.247102052 +0000 UTC m=+45.099983852 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 28 13:20:33 crc kubenswrapper[4970]: I1128 13:20:33.765456 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-w4m45" event={"ID":"77028134-907f-445b-8470-d961a46beea4","Type":"ContainerStarted","Data":"0ac604dfe6835df41dc3474b8d50ff3d31a36fbbb8f2b5d3a34a2e0eabd87a50"} Nov 28 13:20:33 crc kubenswrapper[4970]: I1128 13:20:33.771978 4970 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-rzrx9" podStartSLOduration=20.771945585 podStartE2EDuration="20.771945585s" podCreationTimestamp="2025-11-28 13:20:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-28 13:20:33.767187235 +0000 UTC m=+44.620069035" watchObservedRunningTime="2025-11-28 13:20:33.771945585 +0000 UTC m=+44.624827375" Nov 28 13:20:33 crc kubenswrapper[4970]: I1128 13:20:33.772786 4970 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-t7pk6" podStartSLOduration=19.77278064 podStartE2EDuration="19.77278064s" podCreationTimestamp="2025-11-28 13:20:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-28 13:20:33.687536366 +0000 UTC m=+44.540418166" watchObservedRunningTime="2025-11-28 13:20:33.77278064 +0000 UTC m=+44.625662440" Nov 28 13:20:33 crc kubenswrapper[4970]: I1128 13:20:33.778234 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29405595-bgxjt" event={"ID":"78210a9d-d2ee-4d21-a0e5-956cb8fd85d2","Type":"ContainerStarted","Data":"671fa61148bf67534230c0ca5cca4e9f9e870c2997e7c935de145e1b90e3ed1b"} Nov 28 13:20:33 crc kubenswrapper[4970]: I1128 13:20:33.807513 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-xbnhw" event={"ID":"0c874537-15c1-4f94-b6cd-086c9c1762f0","Type":"ContainerStarted","Data":"5cc6c0f39653cdaa57f60631eb4e479d4c530092d87b2bb5028f94fa8bf9ea96"} Nov 28 13:20:33 crc kubenswrapper[4970]: I1128 13:20:33.834835 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-rjnn5" event={"ID":"05583a8a-e618-4f75-9ce1-b7ada5fc1ae7","Type":"ContainerStarted","Data":"d0c9bc597052afc3405f476d8bbf287897f714e915e197b93d963a250eb0d03c"} Nov 28 13:20:33 crc kubenswrapper[4970]: I1128 13:20:33.835679 4970 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-rjnn5" Nov 28 13:20:33 crc kubenswrapper[4970]: I1128 13:20:33.840011 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-mdr5z" 
event={"ID":"19ff5c79-1e07-4c43-8d35-bdf19869c72b","Type":"ContainerStarted","Data":"99a190f462a86dfcda00b588a17d0f1f66bcda782d2a320c22ef0a811eb36211"} Nov 28 13:20:33 crc kubenswrapper[4970]: I1128 13:20:33.840972 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-2qght" event={"ID":"24797c3c-a0a6-4bab-b61e-dcc1aaedcccb","Type":"ContainerStarted","Data":"5eb73cbab3a1f8eeded5c0cf10bf97e8b3868c4a18d8e2cb03ca5ba39e80e552"} Nov 28 13:20:33 crc kubenswrapper[4970]: I1128 13:20:33.841877 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-pldh4" event={"ID":"a60c525e-c2a9-4977-ae16-e2be015eab30","Type":"ContainerStarted","Data":"7927b01ef88e1654907e9a7566b41d1967e5322f1101cbd0076c421cbad8e9ae"} Nov 28 13:20:33 crc kubenswrapper[4970]: I1128 13:20:33.847447 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-4rqzq" event={"ID":"6ff1b626-d50c-4608-be78-c27b787cc369","Type":"ContainerStarted","Data":"dc977a5aa89835a40144626e7bda2f748e0d2a7d78b2f60685ec7576240f762f"} Nov 28 13:20:33 crc kubenswrapper[4970]: I1128 13:20:33.848006 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tjj4p\" (UID: \"309db78c-54d0-452b-8b62-979217816260\") " pod="openshift-image-registry/image-registry-697d97f7c8-tjj4p" Nov 28 13:20:33 crc kubenswrapper[4970]: E1128 13:20:33.862580 4970 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-28 13:20:34.362564557 +0000 UTC m=+45.215446357 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tjj4p" (UID: "309db78c-54d0-452b-8b62-979217816260") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 28 13:20:33 crc kubenswrapper[4970]: I1128 13:20:33.870316 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-4vr87" event={"ID":"c76cfdfa-c0f1-4f41-8f5b-83161f7f8bf0","Type":"ContainerStarted","Data":"4c33b2bfdf79a5d87fe1c3e8cb53cb364644cf33e15823688f0ca3c7099838a9"} Nov 28 13:20:33 crc kubenswrapper[4970]: I1128 13:20:33.872160 4970 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-rjnn5" Nov 28 13:20:33 crc kubenswrapper[4970]: I1128 13:20:33.880131 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-cbvqk" event={"ID":"e80ce492-28d4-40cf-8a55-5a4f456e8255","Type":"ContainerStarted","Data":"e676930d85edea2db1d850cb45eef6714b770e9ed8cf3d755518912ff0524243"} Nov 28 13:20:33 crc kubenswrapper[4970]: I1128 13:20:33.882030 4970 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-879f6c89f-cbvqk" Nov 28 13:20:33 crc kubenswrapper[4970]: I1128 13:20:33.884743 4970 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-777779d784-7bxck" podStartSLOduration=19.884728241 podStartE2EDuration="19.884728241s" podCreationTimestamp="2025-11-28 13:20:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-28 13:20:33.883381901 +0000 UTC m=+44.736263701" watchObservedRunningTime="2025-11-28 13:20:33.884728241 +0000 UTC m=+44.737610041" Nov 28 13:20:33 crc kubenswrapper[4970]: I1128 13:20:33.885302 4970 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-p47s7" podStartSLOduration=8.885293797 podStartE2EDuration="8.885293797s" podCreationTimestamp="2025-11-28 13:20:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-28 13:20:33.831527372 +0000 UTC m=+44.684409162" watchObservedRunningTime="2025-11-28 13:20:33.885293797 +0000 UTC m=+44.738175607" Nov 28 13:20:33 crc kubenswrapper[4970]: I1128 13:20:33.905782 4970 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-879f6c89f-cbvqk" Nov 28 13:20:33 crc kubenswrapper[4970]: I1128 13:20:33.931474 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-znlgk" event={"ID":"900e9596-8294-4c4d-857a-1b2bf9adaca7","Type":"ContainerStarted","Data":"f6b5c781ecb8244e4ea38fb275437d136647bdc806ff3b2ce2d00ad80aadf6d8"} Nov 28 13:20:33 crc kubenswrapper[4970]: I1128 13:20:33.931528 4970 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-znlgk" Nov 28 13:20:33 crc kubenswrapper[4970]: I1128 13:20:33.941582 4970 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-znlgk" Nov 28 13:20:33 crc kubenswrapper[4970]: I1128 13:20:33.943368 4970 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-m8rgw" podStartSLOduration=20.9433531 podStartE2EDuration="20.9433531s" podCreationTimestamp="2025-11-28 13:20:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-28 13:20:33.941824074 +0000 UTC m=+44.794705874" watchObservedRunningTime="2025-11-28 13:20:33.9433531 +0000 UTC m=+44.796234900" Nov 28 13:20:33 crc kubenswrapper[4970]: I1128 13:20:33.949667 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 28 13:20:33 crc kubenswrapper[4970]: E1128 13:20:33.950391 4970 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-28 13:20:34.450372056 +0000 UTC m=+45.303253856 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 28 13:20:33 crc kubenswrapper[4970]: I1128 13:20:33.961469 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-wms6k" event={"ID":"5998853c-3fbb-403e-b222-5a5c939dbb58","Type":"ContainerStarted","Data":"677d10c996318dd62b4c52428205acf2aff608bb50bf7ed76827b17476572472"} Nov 28 13:20:33 crc kubenswrapper[4970]: I1128 13:20:33.967663 4970 generic.go:334] "Generic (PLEG): container finished" podID="5a56c71c-0c49-4a0a-aee0-2a1ef5936574" containerID="d63a651846dd2222d998f66a122cd21d0117354bddb8cfc6327646e337ed56e0" exitCode=0 Nov 28 13:20:33 crc kubenswrapper[4970]: I1128 13:20:33.968672 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-7nxhr" event={"ID":"5a56c71c-0c49-4a0a-aee0-2a1ef5936574","Type":"ContainerDied","Data":"d63a651846dd2222d998f66a122cd21d0117354bddb8cfc6327646e337ed56e0"} Nov 28 13:20:33 crc kubenswrapper[4970]: I1128 13:20:33.972795 4970 patch_prober.go:28] interesting pod/downloads-7954f5f757-9chbz container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.12:8080/\": dial tcp 10.217.0.12:8080: connect: connection refused" start-of-body= Nov 28 13:20:33 crc kubenswrapper[4970]: I1128 13:20:33.972841 4970 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-9chbz" podUID="79187155-9c7e-48a9-a3f8-3bcf8d921be6" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.12:8080/\": dial tcp 
10.217.0.12:8080: connect: connection refused" Nov 28 13:20:33 crc kubenswrapper[4970]: I1128 13:20:33.994683 4970 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-f9d7485db-kwnx5" podStartSLOduration=20.994665343 podStartE2EDuration="20.994665343s" podCreationTimestamp="2025-11-28 13:20:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-28 13:20:33.994506158 +0000 UTC m=+44.847387948" watchObservedRunningTime="2025-11-28 13:20:33.994665343 +0000 UTC m=+44.847547143" Nov 28 13:20:34 crc kubenswrapper[4970]: I1128 13:20:34.053492 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tjj4p\" (UID: \"309db78c-54d0-452b-8b62-979217816260\") " pod="openshift-image-registry/image-registry-697d97f7c8-tjj4p" Nov 28 13:20:34 crc kubenswrapper[4970]: E1128 13:20:34.056771 4970 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-28 13:20:34.556757164 +0000 UTC m=+45.409638964 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tjj4p" (UID: "309db78c-54d0-452b-8b62-979217816260") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 28 13:20:34 crc kubenswrapper[4970]: I1128 13:20:34.066762 4970 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/machine-api-operator-5694c8668f-wms6k" podStartSLOduration=20.066743658 podStartE2EDuration="20.066743658s" podCreationTimestamp="2025-11-28 13:20:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-28 13:20:34.065653486 +0000 UTC m=+44.918535286" watchObservedRunningTime="2025-11-28 13:20:34.066743658 +0000 UTC m=+44.919625458" Nov 28 13:20:34 crc kubenswrapper[4970]: I1128 13:20:34.103655 4970 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-879f6c89f-cbvqk" podStartSLOduration=21.103636726 podStartE2EDuration="21.103636726s" podCreationTimestamp="2025-11-28 13:20:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-28 13:20:34.101572225 +0000 UTC m=+44.954454025" watchObservedRunningTime="2025-11-28 13:20:34.103636726 +0000 UTC m=+44.956518526" Nov 28 13:20:34 crc kubenswrapper[4970]: I1128 13:20:34.154636 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 28 13:20:34 crc kubenswrapper[4970]: E1128 13:20:34.154795 4970 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-28 13:20:34.654770704 +0000 UTC m=+45.507652504 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 28 13:20:34 crc kubenswrapper[4970]: I1128 13:20:34.155075 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tjj4p\" (UID: \"309db78c-54d0-452b-8b62-979217816260\") " pod="openshift-image-registry/image-registry-697d97f7c8-tjj4p" Nov 28 13:20:34 crc kubenswrapper[4970]: E1128 13:20:34.155412 4970 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-28 13:20:34.655404032 +0000 UTC m=+45.508285832 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tjj4p" (UID: "309db78c-54d0-452b-8b62-979217816260") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 28 13:20:34 crc kubenswrapper[4970]: I1128 13:20:34.157052 4970 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-xbnhw" podStartSLOduration=20.157039181 podStartE2EDuration="20.157039181s" podCreationTimestamp="2025-11-28 13:20:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-28 13:20:34.129133408 +0000 UTC m=+44.982015208" watchObservedRunningTime="2025-11-28 13:20:34.157039181 +0000 UTC m=+45.009920981" Nov 28 13:20:34 crc kubenswrapper[4970]: I1128 13:20:34.207446 4970 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-znlgk" podStartSLOduration=20.207433147 podStartE2EDuration="20.207433147s" podCreationTimestamp="2025-11-28 13:20:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-28 13:20:34.158422151 +0000 UTC m=+45.011303951" watchObservedRunningTime="2025-11-28 13:20:34.207433147 +0000 UTC m=+45.060314947" Nov 28 13:20:34 crc kubenswrapper[4970]: I1128 13:20:34.207813 4970 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29405595-bgxjt" podStartSLOduration=21.207810068 podStartE2EDuration="21.207810068s" 
podCreationTimestamp="2025-11-28 13:20:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-28 13:20:34.206621543 +0000 UTC m=+45.059503343" watchObservedRunningTime="2025-11-28 13:20:34.207810068 +0000 UTC m=+45.060691868" Nov 28 13:20:34 crc kubenswrapper[4970]: I1128 13:20:34.256116 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 28 13:20:34 crc kubenswrapper[4970]: E1128 13:20:34.256465 4970 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-28 13:20:34.756450452 +0000 UTC m=+45.609332252 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 28 13:20:34 crc kubenswrapper[4970]: I1128 13:20:34.357615 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tjj4p\" (UID: \"309db78c-54d0-452b-8b62-979217816260\") " pod="openshift-image-registry/image-registry-697d97f7c8-tjj4p" Nov 28 13:20:34 crc kubenswrapper[4970]: E1128 13:20:34.357884 4970 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-28 13:20:34.857872883 +0000 UTC m=+45.710754683 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tjj4p" (UID: "309db78c-54d0-452b-8b62-979217816260") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 28 13:20:34 crc kubenswrapper[4970]: I1128 13:20:34.458856 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 28 13:20:34 crc kubenswrapper[4970]: E1128 13:20:34.459393 4970 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-11-28 13:20:34.959373856 +0000 UTC m=+45.812255656 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 28 13:20:34 crc kubenswrapper[4970]: I1128 13:20:34.554237 4970 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-hv5px container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.40:5443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Nov 28 13:20:34 crc kubenswrapper[4970]: I1128 13:20:34.554498 4970 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-hv5px" podUID="849a6379-8100-4799-aa30-06c1359673b7" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.40:5443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Nov 28 13:20:34 crc kubenswrapper[4970]: I1128 13:20:34.559972 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tjj4p\" (UID: \"309db78c-54d0-452b-8b62-979217816260\") " pod="openshift-image-registry/image-registry-697d97f7c8-tjj4p" Nov 28 13:20:34 crc kubenswrapper[4970]: E1128 13:20:34.560340 4970 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-28 13:20:35.060329583 +0000 UTC m=+45.913211383 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tjj4p" (UID: "309db78c-54d0-452b-8b62-979217816260") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 28 13:20:34 crc kubenswrapper[4970]: I1128 13:20:34.571640 4970 patch_prober.go:28] interesting pod/router-default-5444994796-6tn57 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Nov 28 13:20:34 crc kubenswrapper[4970]: [-]has-synced failed: reason withheld Nov 28 13:20:34 crc kubenswrapper[4970]: [+]process-running ok Nov 28 13:20:34 crc kubenswrapper[4970]: healthz check failed Nov 28 13:20:34 crc kubenswrapper[4970]: I1128 13:20:34.571714 4970 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-6tn57" podUID="973ba3b3-d07b-40ef-8419-40e19838e816" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Nov 28 13:20:34 crc kubenswrapper[4970]: I1128 13:20:34.661001 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 28 13:20:34 crc kubenswrapper[4970]: E1128 13:20:34.661331 4970 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-28 13:20:35.161315361 +0000 UTC m=+46.014197151 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 28 13:20:34 crc kubenswrapper[4970]: I1128 13:20:34.708965 4970 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-rjnn5" podStartSLOduration=20.70877976 podStartE2EDuration="20.70877976s" podCreationTimestamp="2025-11-28 13:20:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-28 13:20:34.230297801 +0000 UTC m=+45.083179611" watchObservedRunningTime="2025-11-28 13:20:34.70877976 +0000 UTC m=+45.561661560" Nov 28 13:20:34 crc kubenswrapper[4970]: I1128 13:20:34.711249 4970 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-jcv9n"] Nov 28 13:20:34 crc kubenswrapper[4970]: I1128 13:20:34.712364 4970 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-jcv9n" Nov 28 13:20:34 crc kubenswrapper[4970]: I1128 13:20:34.714696 4970 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Nov 28 13:20:34 crc kubenswrapper[4970]: I1128 13:20:34.722253 4970 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-jcv9n"] Nov 28 13:20:34 crc kubenswrapper[4970]: I1128 13:20:34.762162 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tjj4p\" (UID: \"309db78c-54d0-452b-8b62-979217816260\") " pod="openshift-image-registry/image-registry-697d97f7c8-tjj4p" Nov 28 13:20:34 crc kubenswrapper[4970]: E1128 13:20:34.762549 4970 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-28 13:20:35.262532766 +0000 UTC m=+46.115414576 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tjj4p" (UID: "309db78c-54d0-452b-8b62-979217816260") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 28 13:20:34 crc kubenswrapper[4970]: I1128 13:20:34.833369 4970 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock" Nov 28 13:20:34 crc kubenswrapper[4970]: I1128 13:20:34.862827 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 28 13:20:34 crc kubenswrapper[4970]: I1128 13:20:34.862994 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/619af67d-331c-4b38-b536-269ba823fd75-catalog-content\") pod \"certified-operators-jcv9n\" (UID: \"619af67d-331c-4b38-b536-269ba823fd75\") " pod="openshift-marketplace/certified-operators-jcv9n" Nov 28 13:20:34 crc kubenswrapper[4970]: I1128 13:20:34.863034 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jnsfh\" (UniqueName: \"kubernetes.io/projected/619af67d-331c-4b38-b536-269ba823fd75-kube-api-access-jnsfh\") pod \"certified-operators-jcv9n\" (UID: \"619af67d-331c-4b38-b536-269ba823fd75\") " pod="openshift-marketplace/certified-operators-jcv9n" Nov 28 13:20:34 crc kubenswrapper[4970]: I1128 13:20:34.863060 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/619af67d-331c-4b38-b536-269ba823fd75-utilities\") pod \"certified-operators-jcv9n\" (UID: \"619af67d-331c-4b38-b536-269ba823fd75\") " 
pod="openshift-marketplace/certified-operators-jcv9n" Nov 28 13:20:34 crc kubenswrapper[4970]: E1128 13:20:34.863191 4970 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-28 13:20:35.363177733 +0000 UTC m=+46.216059533 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 28 13:20:34 crc kubenswrapper[4970]: I1128 13:20:34.964776 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/619af67d-331c-4b38-b536-269ba823fd75-catalog-content\") pod \"certified-operators-jcv9n\" (UID: \"619af67d-331c-4b38-b536-269ba823fd75\") " pod="openshift-marketplace/certified-operators-jcv9n" Nov 28 13:20:34 crc kubenswrapper[4970]: I1128 13:20:34.964833 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jnsfh\" (UniqueName: \"kubernetes.io/projected/619af67d-331c-4b38-b536-269ba823fd75-kube-api-access-jnsfh\") pod \"certified-operators-jcv9n\" (UID: \"619af67d-331c-4b38-b536-269ba823fd75\") " pod="openshift-marketplace/certified-operators-jcv9n" Nov 28 13:20:34 crc kubenswrapper[4970]: I1128 13:20:34.964866 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/619af67d-331c-4b38-b536-269ba823fd75-utilities\") pod \"certified-operators-jcv9n\" (UID: \"619af67d-331c-4b38-b536-269ba823fd75\") " pod="openshift-marketplace/certified-operators-jcv9n" Nov 28 13:20:34 crc kubenswrapper[4970]: I1128 13:20:34.964908 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tjj4p\" (UID: \"309db78c-54d0-452b-8b62-979217816260\") " pod="openshift-image-registry/image-registry-697d97f7c8-tjj4p" Nov 28 13:20:34 crc kubenswrapper[4970]: E1128 13:20:34.965437 4970 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-28 13:20:35.465423478 +0000 UTC m=+46.318305278 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tjj4p" (UID: "309db78c-54d0-452b-8b62-979217816260") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 28 13:20:34 crc kubenswrapper[4970]: I1128 13:20:34.965461 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/619af67d-331c-4b38-b536-269ba823fd75-catalog-content\") pod \"certified-operators-jcv9n\" (UID: \"619af67d-331c-4b38-b536-269ba823fd75\") " pod="openshift-marketplace/certified-operators-jcv9n" Nov 28 13:20:34 crc kubenswrapper[4970]: I1128 13:20:34.965826 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/619af67d-331c-4b38-b536-269ba823fd75-utilities\") pod \"certified-operators-jcv9n\" (UID: \"619af67d-331c-4b38-b536-269ba823fd75\") " pod="openshift-marketplace/certified-operators-jcv9n" Nov 28 13:20:34 crc kubenswrapper[4970]: I1128 13:20:34.998848 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-7nxhr" event={"ID":"5a56c71c-0c49-4a0a-aee0-2a1ef5936574","Type":"ContainerStarted","Data":"96089e4b35d716d5ede43efd3b72de44a53077f934f30f22f08d671377e300b2"} Nov 28 13:20:34 crc kubenswrapper[4970]: I1128 13:20:34.999178 4970 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-7777fb866f-7nxhr" Nov 28 13:20:34 crc kubenswrapper[4970]: I1128 13:20:34.999905 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jnsfh\" (UniqueName: \"kubernetes.io/projected/619af67d-331c-4b38-b536-269ba823fd75-kube-api-access-jnsfh\") pod \"certified-operators-jcv9n\" (UID: \"619af67d-331c-4b38-b536-269ba823fd75\") " pod="openshift-marketplace/certified-operators-jcv9n" Nov 28 13:20:35 crc kubenswrapper[4970]: I1128 13:20:35.008723 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-pldh4" event={"ID":"a60c525e-c2a9-4977-ae16-e2be015eab30","Type":"ContainerStarted","Data":"deece3ac168da9638cf97741602ca927958369e0a719ce2c29f431fe3eca780c"} Nov 28 13:20:35 crc kubenswrapper[4970]: I1128 13:20:35.017843 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-mdr5z" event={"ID":"19ff5c79-1e07-4c43-8d35-bdf19869c72b","Type":"ContainerStarted","Data":"ce5a64dc39c65bfb38a7e65b3031cd156d1b4031b10c19b373b7ca75c6720111"} Nov 28 13:20:35 crc kubenswrapper[4970]: I1128 13:20:35.028134 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-78pq5" event={"ID":"336437f6-aba2-46ae-bf5f-2555d2db13fb","Type":"ContainerStarted","Data":"3ba8f881bec73192e61e275569d00dc015e235105847501e6c7c6e967d484cd3"} Nov 28 13:20:35 crc kubenswrapper[4970]: I1128 13:20:35.037812 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-w4m45" event={"ID":"77028134-907f-445b-8470-d961a46beea4","Type":"ContainerStarted","Data":"69d8255bc8421bdb22f9474d222008c4336ecbf5fe48e83ae11ca48a477de437"} Nov 28 
13:20:35 crc kubenswrapper[4970]: I1128 13:20:35.043390 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-4vr87" event={"ID":"c76cfdfa-c0f1-4f41-8f5b-83161f7f8bf0","Type":"ContainerStarted","Data":"7e93eadc112d0b0991fff47d3a4f0c1bd09a6e4cbd010418545ce7d6f595f2c3"} Nov 28 13:20:35 crc kubenswrapper[4970]: I1128 13:20:35.061675 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-2qght" event={"ID":"24797c3c-a0a6-4bab-b61e-dcc1aaedcccb","Type":"ContainerStarted","Data":"46dd28dca6a32ea082c38a8a328ac9421f5bb739e8cf6b64213191de8a9d7a16"} Nov 28 13:20:35 crc kubenswrapper[4970]: I1128 13:20:35.065670 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 28 13:20:35 crc kubenswrapper[4970]: E1128 13:20:35.065893 4970 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-28 13:20:35.56586385 +0000 UTC m=+46.418745790 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 28 13:20:35 crc kubenswrapper[4970]: I1128 13:20:35.066258 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tjj4p\" (UID: \"309db78c-54d0-452b-8b62-979217816260\") " pod="openshift-image-registry/image-registry-697d97f7c8-tjj4p" Nov 28 13:20:35 crc kubenswrapper[4970]: E1128 13:20:35.066669 4970 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-28 13:20:35.566661804 +0000 UTC m=+46.419543604 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tjj4p" (UID: "309db78c-54d0-452b-8b62-979217816260") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 28 13:20:35 crc kubenswrapper[4970]: I1128 13:20:35.071328 4970 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-config-operator/openshift-config-operator-7777fb866f-7nxhr" podStartSLOduration=22.071316431 podStartE2EDuration="22.071316431s" podCreationTimestamp="2025-11-28 13:20:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-28 13:20:35.070369673 +0000 UTC m=+45.923251473" watchObservedRunningTime="2025-11-28 13:20:35.071316431 +0000 UTC m=+45.924198231" Nov 28 13:20:35 crc kubenswrapper[4970]: I1128 13:20:35.075594 4970 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-jcv9n" Nov 28 13:20:35 crc kubenswrapper[4970]: I1128 13:20:35.081362 4970 generic.go:334] "Generic (PLEG): container finished" podID="78210a9d-d2ee-4d21-a0e5-956cb8fd85d2" containerID="671fa61148bf67534230c0ca5cca4e9f9e870c2997e7c935de145e1b90e3ed1b" exitCode=0 Nov 28 13:20:35 crc kubenswrapper[4970]: I1128 13:20:35.081607 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29405595-bgxjt" event={"ID":"78210a9d-d2ee-4d21-a0e5-956cb8fd85d2","Type":"ContainerDied","Data":"671fa61148bf67534230c0ca5cca4e9f9e870c2997e7c935de145e1b90e3ed1b"} Nov 28 13:20:35 crc kubenswrapper[4970]: I1128 13:20:35.087395 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-4rqzq" event={"ID":"6ff1b626-d50c-4608-be78-c27b787cc369","Type":"ContainerStarted","Data":"7cacb3bcd65f6b1d26f5493dccf0db38897a6c0cbcefdca41bc5c59eff220b51"} Nov 28 13:20:35 crc kubenswrapper[4970]: I1128 13:20:35.088906 4970 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-multus/cni-sysctl-allowlist-ds-8btsr" podUID="a5b9dda0-da70-4e7c-850b-de8b7744a15c" containerName="kube-multus-additional-cni-plugins" containerID="cri-o://2fc28451c84d1813a773345a163950b323fe8d1c9f7deb705e1af63c1f469e25" gracePeriod=30 Nov 28 13:20:35 crc kubenswrapper[4970]: I1128 13:20:35.090373 4970 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-nccsb container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.42:8080/healthz\": dial tcp 10.217.0.42:8080: connect: connection refused" start-of-body= Nov 28 13:20:35 crc kubenswrapper[4970]: I1128 13:20:35.090429 4970 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-nccsb" podUID="514859c2-bd3c-4ccb-90b0-61180a1bc297" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.42:8080/healthz\": dial tcp 10.217.0.42:8080: connect: connection refused" Nov 28 13:20:35 crc kubenswrapper[4970]: I1128 13:20:35.101521 4970 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver/apiserver-76f77b778f-78pq5" 
podStartSLOduration=22.101504581 podStartE2EDuration="22.101504581s" podCreationTimestamp="2025-11-28 13:20:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-28 13:20:35.101156861 +0000 UTC m=+45.954038661" watchObservedRunningTime="2025-11-28 13:20:35.101504581 +0000 UTC m=+45.954386381" Nov 28 13:20:35 crc kubenswrapper[4970]: I1128 13:20:35.120429 4970 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-5jvzw"] Nov 28 13:20:35 crc kubenswrapper[4970]: I1128 13:20:35.121451 4970 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-5jvzw" Nov 28 13:20:35 crc kubenswrapper[4970]: I1128 13:20:35.158257 4970 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-5jvzw"] Nov 28 13:20:35 crc kubenswrapper[4970]: I1128 13:20:35.158855 4970 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-4vr87" podStartSLOduration=22.158841212 podStartE2EDuration="22.158841212s" podCreationTimestamp="2025-11-28 13:20:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-28 13:20:35.15470431 +0000 UTC m=+46.007586110" watchObservedRunningTime="2025-11-28 13:20:35.158841212 +0000 UTC m=+46.011723012" Nov 28 13:20:35 crc kubenswrapper[4970]: I1128 13:20:35.186800 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 28 13:20:35 crc kubenswrapper[4970]: E1128 13:20:35.187124 4970 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-28 13:20:35.687096725 +0000 UTC m=+46.539978525 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 28 13:20:35 crc kubenswrapper[4970]: I1128 13:20:35.188289 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ea4c7183-d326-44cd-8e40-649e3dad901e-utilities\") pod \"certified-operators-5jvzw\" (UID: \"ea4c7183-d326-44cd-8e40-649e3dad901e\") " pod="openshift-marketplace/certified-operators-5jvzw" Nov 28 13:20:35 crc kubenswrapper[4970]: I1128 13:20:35.188793 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vr2dl\" (UniqueName: \"kubernetes.io/projected/ea4c7183-d326-44cd-8e40-649e3dad901e-kube-api-access-vr2dl\") pod \"certified-operators-5jvzw\" (UID: \"ea4c7183-d326-44cd-8e40-649e3dad901e\") " pod="openshift-marketplace/certified-operators-5jvzw" Nov 28 13:20:35 crc kubenswrapper[4970]: I1128 13:20:35.188957 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ea4c7183-d326-44cd-8e40-649e3dad901e-catalog-content\") pod \"certified-operators-5jvzw\" (UID: \"ea4c7183-d326-44cd-8e40-649e3dad901e\") " pod="openshift-marketplace/certified-operators-5jvzw" Nov 28 13:20:35 crc kubenswrapper[4970]: I1128 13:20:35.189232 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tjj4p\" (UID: \"309db78c-54d0-452b-8b62-979217816260\") " pod="openshift-image-registry/image-registry-697d97f7c8-tjj4p" Nov 28 13:20:35 crc kubenswrapper[4970]: E1128 13:20:35.192602 4970 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-28 13:20:35.692587197 +0000 UTC m=+46.545468997 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tjj4p" (UID: "309db78c-54d0-452b-8b62-979217816260") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 28 13:20:35 crc kubenswrapper[4970]: I1128 13:20:35.232607 4970 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-w4m45" podStartSLOduration=21.232590477 podStartE2EDuration="21.232590477s" podCreationTimestamp="2025-11-28 13:20:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-28 13:20:35.198278435 +0000 UTC m=+46.051160255" watchObservedRunningTime="2025-11-28 13:20:35.232590477 +0000 UTC m=+46.085472277" Nov 28 13:20:35 crc kubenswrapper[4970]: I1128 13:20:35.234599 4970 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-admission-controller-857f4d67dd-pldh4" podStartSLOduration=21.234588256 podStartE2EDuration="21.234588256s" podCreationTimestamp="2025-11-28 13:20:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-28 13:20:35.231805954 +0000 UTC m=+46.084687754" watchObservedRunningTime="2025-11-28 13:20:35.234588256 +0000 UTC m=+46.087470056" Nov 28 13:20:35 crc kubenswrapper[4970]: I1128 13:20:35.266592 4970 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-mdr5z" podStartSLOduration=22.266573469 podStartE2EDuration="22.266573469s" podCreationTimestamp="2025-11-28 13:20:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-28 13:20:35.265756515 +0000 UTC m=+46.118638315" watchObservedRunningTime="2025-11-28 13:20:35.266573469 +0000 UTC m=+46.119455269" Nov 28 13:20:35 crc kubenswrapper[4970]: I1128 13:20:35.297146 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 28 13:20:35 crc kubenswrapper[4970]: I1128 13:20:35.297396 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ea4c7183-d326-44cd-8e40-649e3dad901e-catalog-content\") pod \"certified-operators-5jvzw\" (UID: \"ea4c7183-d326-44cd-8e40-649e3dad901e\") " pod="openshift-marketplace/certified-operators-5jvzw" Nov 28 13:20:35 crc kubenswrapper[4970]: I1128 13:20:35.297477 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ea4c7183-d326-44cd-8e40-649e3dad901e-utilities\") pod \"certified-operators-5jvzw\" (UID: \"ea4c7183-d326-44cd-8e40-649e3dad901e\") " pod="openshift-marketplace/certified-operators-5jvzw" Nov 28 13:20:35 crc kubenswrapper[4970]: I1128 13:20:35.297552 4970 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-vr2dl\" (UniqueName: \"kubernetes.io/projected/ea4c7183-d326-44cd-8e40-649e3dad901e-kube-api-access-vr2dl\") pod \"certified-operators-5jvzw\" (UID: \"ea4c7183-d326-44cd-8e40-649e3dad901e\") " pod="openshift-marketplace/certified-operators-5jvzw" Nov 28 13:20:35 crc kubenswrapper[4970]: E1128 13:20:35.297872 4970 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-28 13:20:35.797858501 +0000 UTC m=+46.650740301 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 28 13:20:35 crc kubenswrapper[4970]: I1128 13:20:35.298255 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ea4c7183-d326-44cd-8e40-649e3dad901e-catalog-content\") pod \"certified-operators-5jvzw\" (UID: \"ea4c7183-d326-44cd-8e40-649e3dad901e\") " pod="openshift-marketplace/certified-operators-5jvzw" Nov 28 13:20:35 crc kubenswrapper[4970]: I1128 13:20:35.298467 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ea4c7183-d326-44cd-8e40-649e3dad901e-utilities\") pod \"certified-operators-5jvzw\" (UID: \"ea4c7183-d326-44cd-8e40-649e3dad901e\") " pod="openshift-marketplace/certified-operators-5jvzw" Nov 28 13:20:35 crc kubenswrapper[4970]: I1128 13:20:35.313959 4970 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-4rqzq" podStartSLOduration=21.313944026 podStartE2EDuration="21.313944026s" podCreationTimestamp="2025-11-28 13:20:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-28 13:20:35.312170633 +0000 UTC m=+46.165052433" watchObservedRunningTime="2025-11-28 13:20:35.313944026 +0000 UTC m=+46.166825826" Nov 28 13:20:35 crc kubenswrapper[4970]: I1128 13:20:35.329129 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vr2dl\" (UniqueName: \"kubernetes.io/projected/ea4c7183-d326-44cd-8e40-649e3dad901e-kube-api-access-vr2dl\") pod \"certified-operators-5jvzw\" (UID: \"ea4c7183-d326-44cd-8e40-649e3dad901e\") " pod="openshift-marketplace/certified-operators-5jvzw" Nov 28 13:20:35 crc kubenswrapper[4970]: I1128 13:20:35.403028 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tjj4p\" (UID: \"309db78c-54d0-452b-8b62-979217816260\") " pod="openshift-image-registry/image-registry-697d97f7c8-tjj4p" Nov 28 13:20:35 crc kubenswrapper[4970]: E1128 13:20:35.403338 4970 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-28 13:20:35.903327291 +0000 UTC m=+46.756209091 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tjj4p" (UID: "309db78c-54d0-452b-8b62-979217816260") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 28 13:20:35 crc kubenswrapper[4970]: I1128 13:20:35.456683 4970 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-5jvzw" Nov 28 13:20:35 crc kubenswrapper[4970]: I1128 13:20:35.487616 4970 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock","Timestamp":"2025-11-28T13:20:34.833398035Z","Handler":null,"Name":""} Nov 28 13:20:35 crc kubenswrapper[4970]: I1128 13:20:35.506453 4970 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: kubevirt.io.hostpath-provisioner endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock versions: 1.0.0 Nov 28 13:20:35 crc kubenswrapper[4970]: I1128 13:20:35.506493 4970 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: kubevirt.io.hostpath-provisioner at endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock Nov 28 13:20:35 crc kubenswrapper[4970]: I1128 13:20:35.510633 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 28 13:20:35 crc kubenswrapper[4970]: I1128 13:20:35.585865 4970 patch_prober.go:28] interesting pod/router-default-5444994796-6tn57 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Nov 28 13:20:35 crc kubenswrapper[4970]: [-]has-synced failed: reason withheld Nov 28 13:20:35 crc kubenswrapper[4970]: [+]process-running ok Nov 28 13:20:35 crc kubenswrapper[4970]: healthz check failed Nov 28 13:20:35 crc kubenswrapper[4970]: I1128 13:20:35.586196 4970 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-6tn57" podUID="973ba3b3-d07b-40ef-8419-40e19838e816" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Nov 28 13:20:35 crc kubenswrapper[4970]: I1128 13:20:35.623752 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". 
PluginName "kubernetes.io/csi", VolumeGidValue "" Nov 28 13:20:35 crc kubenswrapper[4970]: I1128 13:20:35.672235 4970 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-jcv9n"] Nov 28 13:20:35 crc kubenswrapper[4970]: W1128 13:20:35.686543 4970 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod619af67d_331c_4b38_b536_269ba823fd75.slice/crio-28be4c283cf4514048523c6a051008763a544e99a642d0a9f6dcebb2c99341ed WatchSource:0}: Error finding container 28be4c283cf4514048523c6a051008763a544e99a642d0a9f6dcebb2c99341ed: Status 404 returned error can't find the container with id 28be4c283cf4514048523c6a051008763a544e99a642d0a9f6dcebb2c99341ed Nov 28 13:20:35 crc kubenswrapper[4970]: I1128 13:20:35.710133 4970 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-mwg2n"] Nov 28 13:20:35 crc kubenswrapper[4970]: I1128 13:20:35.710999 4970 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-mwg2n" Nov 28 13:20:35 crc kubenswrapper[4970]: I1128 13:20:35.714963 4970 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Nov 28 13:20:35 crc kubenswrapper[4970]: I1128 13:20:35.715679 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tjj4p\" (UID: \"309db78c-54d0-452b-8b62-979217816260\") " pod="openshift-image-registry/image-registry-697d97f7c8-tjj4p" Nov 28 13:20:35 crc kubenswrapper[4970]: I1128 13:20:35.736062 4970 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-mwg2n"] Nov 28 13:20:35 crc kubenswrapper[4970]: I1128 13:20:35.816946 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fdf78924-9472-414e-baf6-822e511c464c-catalog-content\") pod \"community-operators-mwg2n\" (UID: \"fdf78924-9472-414e-baf6-822e511c464c\") " pod="openshift-marketplace/community-operators-mwg2n" Nov 28 13:20:35 crc kubenswrapper[4970]: I1128 13:20:35.817111 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fdf78924-9472-414e-baf6-822e511c464c-utilities\") pod \"community-operators-mwg2n\" (UID: \"fdf78924-9472-414e-baf6-822e511c464c\") " pod="openshift-marketplace/community-operators-mwg2n" Nov 28 13:20:35 crc kubenswrapper[4970]: I1128 13:20:35.817170 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6jht9\" (UniqueName: \"kubernetes.io/projected/fdf78924-9472-414e-baf6-822e511c464c-kube-api-access-6jht9\") pod \"community-operators-mwg2n\" (UID: \"fdf78924-9472-414e-baf6-822e511c464c\") " pod="openshift-marketplace/community-operators-mwg2n" Nov 28 13:20:35 crc kubenswrapper[4970]: I1128 13:20:35.873578 4970 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-5jvzw"] Nov 28 13:20:35 crc kubenswrapper[4970]: W1128 13:20:35.878593 4970 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podea4c7183_d326_44cd_8e40_649e3dad901e.slice/crio-411d2fc0ceb52772aa19488c50007a1434462d8782dc69f22d7195a1989afc09 WatchSource:0}: Error finding container 411d2fc0ceb52772aa19488c50007a1434462d8782dc69f22d7195a1989afc09: Status 404 returned error can't find the container with id 411d2fc0ceb52772aa19488c50007a1434462d8782dc69f22d7195a1989afc09 Nov 28 13:20:35 crc kubenswrapper[4970]: I1128 13:20:35.918641 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fdf78924-9472-414e-baf6-822e511c464c-utilities\") pod \"community-operators-mwg2n\" (UID: \"fdf78924-9472-414e-baf6-822e511c464c\") " pod="openshift-marketplace/community-operators-mwg2n" Nov 28 13:20:35 crc kubenswrapper[4970]: I1128 13:20:35.918712 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6jht9\" (UniqueName: \"kubernetes.io/projected/fdf78924-9472-414e-baf6-822e511c464c-kube-api-access-6jht9\") pod \"community-operators-mwg2n\" (UID: \"fdf78924-9472-414e-baf6-822e511c464c\") " pod="openshift-marketplace/community-operators-mwg2n" Nov 28 13:20:35 crc kubenswrapper[4970]: I1128 13:20:35.918788 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fdf78924-9472-414e-baf6-822e511c464c-catalog-content\") pod \"community-operators-mwg2n\" (UID: \"fdf78924-9472-414e-baf6-822e511c464c\") " pod="openshift-marketplace/community-operators-mwg2n" Nov 28 13:20:35 crc kubenswrapper[4970]: I1128 13:20:35.919296 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fdf78924-9472-414e-baf6-822e511c464c-utilities\") pod \"community-operators-mwg2n\" (UID: \"fdf78924-9472-414e-baf6-822e511c464c\") " pod="openshift-marketplace/community-operators-mwg2n" Nov 28 13:20:35 crc kubenswrapper[4970]: I1128 13:20:35.919481 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fdf78924-9472-414e-baf6-822e511c464c-catalog-content\") pod \"community-operators-mwg2n\" (UID: \"fdf78924-9472-414e-baf6-822e511c464c\") " pod="openshift-marketplace/community-operators-mwg2n" Nov 28 13:20:35 crc kubenswrapper[4970]: I1128 13:20:35.936774 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6jht9\" (UniqueName: \"kubernetes.io/projected/fdf78924-9472-414e-baf6-822e511c464c-kube-api-access-6jht9\") pod \"community-operators-mwg2n\" (UID: \"fdf78924-9472-414e-baf6-822e511c464c\") " pod="openshift-marketplace/community-operators-mwg2n" Nov 28 13:20:36 crc kubenswrapper[4970]: I1128 13:20:36.047828 4970 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-mwg2n" Nov 28 13:20:36 crc kubenswrapper[4970]: I1128 13:20:36.090535 4970 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-hv5px container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.40:5443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Nov 28 13:20:36 crc kubenswrapper[4970]: I1128 13:20:36.090591 4970 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-hv5px" podUID="849a6379-8100-4799-aa30-06c1359673b7" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.40:5443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Nov 28 13:20:36 crc kubenswrapper[4970]: I1128 13:20:36.096511 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5jvzw" event={"ID":"ea4c7183-d326-44cd-8e40-649e3dad901e","Type":"ContainerStarted","Data":"411d2fc0ceb52772aa19488c50007a1434462d8782dc69f22d7195a1989afc09"} Nov 28 13:20:36 crc kubenswrapper[4970]: I1128 13:20:36.098764 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-2qght" event={"ID":"24797c3c-a0a6-4bab-b61e-dcc1aaedcccb","Type":"ContainerStarted","Data":"5c2d7821b2d8e12fcd449160be7f2cf643f996e6a9c0f7b2a02a816052ec7fc0"} Nov 28 13:20:36 crc kubenswrapper[4970]: I1128 13:20:36.099753 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jcv9n" event={"ID":"619af67d-331c-4b38-b536-269ba823fd75","Type":"ContainerStarted","Data":"28be4c283cf4514048523c6a051008763a544e99a642d0a9f6dcebb2c99341ed"} Nov 28 13:20:36 crc kubenswrapper[4970]: I1128 13:20:36.104954 4970 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-l4xkm"] Nov 28 13:20:36 crc kubenswrapper[4970]: I1128 13:20:36.105930 4970 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-l4xkm" Nov 28 13:20:36 crc kubenswrapper[4970]: I1128 13:20:36.108545 4970 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-nccsb" Nov 28 13:20:36 crc kubenswrapper[4970]: I1128 13:20:36.123517 4970 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-l4xkm"] Nov 28 13:20:36 crc kubenswrapper[4970]: I1128 13:20:36.222773 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/507a4c7b-eff8-4695-a222-3a40b0483eb8-utilities\") pod \"community-operators-l4xkm\" (UID: \"507a4c7b-eff8-4695-a222-3a40b0483eb8\") " pod="openshift-marketplace/community-operators-l4xkm" Nov 28 13:20:36 crc kubenswrapper[4970]: I1128 13:20:36.223014 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-644lr\" (UniqueName: \"kubernetes.io/projected/507a4c7b-eff8-4695-a222-3a40b0483eb8-kube-api-access-644lr\") pod \"community-operators-l4xkm\" (UID: \"507a4c7b-eff8-4695-a222-3a40b0483eb8\") " pod="openshift-marketplace/community-operators-l4xkm" Nov 28 13:20:36 crc kubenswrapper[4970]: I1128 13:20:36.223047 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/507a4c7b-eff8-4695-a222-3a40b0483eb8-catalog-content\") pod \"community-operators-l4xkm\" (UID: \"507a4c7b-eff8-4695-a222-3a40b0483eb8\") " pod="openshift-marketplace/community-operators-l4xkm" Nov 28 13:20:36 crc kubenswrapper[4970]: I1128 13:20:36.287966 4970 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-mwg2n"] Nov 28 13:20:36 crc kubenswrapper[4970]: I1128 13:20:36.310495 4970 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Nov 28 13:20:36 crc kubenswrapper[4970]: I1128 13:20:36.310546 4970 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tjj4p\" (UID: \"309db78c-54d0-452b-8b62-979217816260\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount\"" pod="openshift-image-registry/image-registry-697d97f7c8-tjj4p" Nov 28 13:20:36 crc kubenswrapper[4970]: I1128 13:20:36.324283 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-644lr\" (UniqueName: \"kubernetes.io/projected/507a4c7b-eff8-4695-a222-3a40b0483eb8-kube-api-access-644lr\") pod \"community-operators-l4xkm\" (UID: \"507a4c7b-eff8-4695-a222-3a40b0483eb8\") " pod="openshift-marketplace/community-operators-l4xkm" Nov 28 13:20:36 crc kubenswrapper[4970]: I1128 13:20:36.324376 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/507a4c7b-eff8-4695-a222-3a40b0483eb8-catalog-content\") pod \"community-operators-l4xkm\" (UID: \"507a4c7b-eff8-4695-a222-3a40b0483eb8\") " pod="openshift-marketplace/community-operators-l4xkm" Nov 28 13:20:36 crc kubenswrapper[4970]: I1128 13:20:36.324468 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/507a4c7b-eff8-4695-a222-3a40b0483eb8-utilities\") pod \"community-operators-l4xkm\" (UID: \"507a4c7b-eff8-4695-a222-3a40b0483eb8\") " pod="openshift-marketplace/community-operators-l4xkm" Nov 28 13:20:36 crc kubenswrapper[4970]: I1128 13:20:36.324985 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/507a4c7b-eff8-4695-a222-3a40b0483eb8-utilities\") pod \"community-operators-l4xkm\" (UID: \"507a4c7b-eff8-4695-a222-3a40b0483eb8\") " pod="openshift-marketplace/community-operators-l4xkm" Nov 28 13:20:36 crc kubenswrapper[4970]: I1128 13:20:36.325439 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/507a4c7b-eff8-4695-a222-3a40b0483eb8-catalog-content\") pod \"community-operators-l4xkm\" (UID: \"507a4c7b-eff8-4695-a222-3a40b0483eb8\") " pod="openshift-marketplace/community-operators-l4xkm" Nov 28 13:20:36 crc kubenswrapper[4970]: I1128 13:20:36.362883 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-644lr\" (UniqueName: \"kubernetes.io/projected/507a4c7b-eff8-4695-a222-3a40b0483eb8-kube-api-access-644lr\") pod \"community-operators-l4xkm\" (UID: \"507a4c7b-eff8-4695-a222-3a40b0483eb8\") " pod="openshift-marketplace/community-operators-l4xkm" Nov 28 13:20:36 crc kubenswrapper[4970]: I1128 13:20:36.369540 4970 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-hv5px" Nov 28 13:20:36 crc kubenswrapper[4970]: I1128 13:20:36.443024 4970 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-l4xkm" Nov 28 13:20:36 crc kubenswrapper[4970]: I1128 13:20:36.484209 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tjj4p\" (UID: \"309db78c-54d0-452b-8b62-979217816260\") " pod="openshift-image-registry/image-registry-697d97f7c8-tjj4p" Nov 28 13:20:36 crc kubenswrapper[4970]: I1128 13:20:36.572872 4970 patch_prober.go:28] interesting pod/router-default-5444994796-6tn57 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Nov 28 13:20:36 crc kubenswrapper[4970]: [-]has-synced failed: reason withheld Nov 28 13:20:36 crc kubenswrapper[4970]: [+]process-running ok Nov 28 13:20:36 crc kubenswrapper[4970]: healthz check failed Nov 28 13:20:36 crc kubenswrapper[4970]: I1128 13:20:36.573099 4970 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-6tn57" podUID="973ba3b3-d07b-40ef-8419-40e19838e816" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Nov 28 13:20:36 crc kubenswrapper[4970]: I1128 13:20:36.573797 4970 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29405595-bgxjt" Nov 28 13:20:36 crc kubenswrapper[4970]: I1128 13:20:36.629880 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cp4gq\" (UniqueName: \"kubernetes.io/projected/78210a9d-d2ee-4d21-a0e5-956cb8fd85d2-kube-api-access-cp4gq\") pod \"78210a9d-d2ee-4d21-a0e5-956cb8fd85d2\" (UID: \"78210a9d-d2ee-4d21-a0e5-956cb8fd85d2\") " Nov 28 13:20:36 crc kubenswrapper[4970]: I1128 13:20:36.629990 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/78210a9d-d2ee-4d21-a0e5-956cb8fd85d2-secret-volume\") pod \"78210a9d-d2ee-4d21-a0e5-956cb8fd85d2\" (UID: \"78210a9d-d2ee-4d21-a0e5-956cb8fd85d2\") " Nov 28 13:20:36 crc kubenswrapper[4970]: I1128 13:20:36.630027 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/78210a9d-d2ee-4d21-a0e5-956cb8fd85d2-config-volume\") pod \"78210a9d-d2ee-4d21-a0e5-956cb8fd85d2\" (UID: \"78210a9d-d2ee-4d21-a0e5-956cb8fd85d2\") " Nov 28 13:20:36 crc kubenswrapper[4970]: I1128 13:20:36.630981 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/78210a9d-d2ee-4d21-a0e5-956cb8fd85d2-config-volume" (OuterVolumeSpecName: "config-volume") pod "78210a9d-d2ee-4d21-a0e5-956cb8fd85d2" (UID: "78210a9d-d2ee-4d21-a0e5-956cb8fd85d2"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 28 13:20:36 crc kubenswrapper[4970]: I1128 13:20:36.640637 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/78210a9d-d2ee-4d21-a0e5-956cb8fd85d2-kube-api-access-cp4gq" (OuterVolumeSpecName: "kube-api-access-cp4gq") pod "78210a9d-d2ee-4d21-a0e5-956cb8fd85d2" (UID: "78210a9d-d2ee-4d21-a0e5-956cb8fd85d2"). InnerVolumeSpecName "kube-api-access-cp4gq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 28 13:20:36 crc kubenswrapper[4970]: I1128 13:20:36.640820 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/78210a9d-d2ee-4d21-a0e5-956cb8fd85d2-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "78210a9d-d2ee-4d21-a0e5-956cb8fd85d2" (UID: "78210a9d-d2ee-4d21-a0e5-956cb8fd85d2"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 28 13:20:36 crc kubenswrapper[4970]: I1128 13:20:36.731663 4970 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/78210a9d-d2ee-4d21-a0e5-956cb8fd85d2-secret-volume\") on node \"crc\" DevicePath \"\"" Nov 28 13:20:36 crc kubenswrapper[4970]: I1128 13:20:36.731692 4970 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/78210a9d-d2ee-4d21-a0e5-956cb8fd85d2-config-volume\") on node \"crc\" DevicePath \"\"" Nov 28 13:20:36 crc kubenswrapper[4970]: I1128 13:20:36.731701 4970 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cp4gq\" (UniqueName: \"kubernetes.io/projected/78210a9d-d2ee-4d21-a0e5-956cb8fd85d2-kube-api-access-cp4gq\") on node \"crc\" DevicePath \"\"" Nov 28 13:20:36 crc kubenswrapper[4970]: I1128 13:20:36.789974 4970 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-tjj4p" Nov 28 13:20:36 crc kubenswrapper[4970]: I1128 13:20:36.814988 4970 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-l4xkm"] Nov 28 13:20:36 crc kubenswrapper[4970]: W1128 13:20:36.834908 4970 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod507a4c7b_eff8_4695_a222_3a40b0483eb8.slice/crio-f61c5c9e287118e35418471d83ff2cda4c01998b8748e8aa88947f45d43a5a40 WatchSource:0}: Error finding container f61c5c9e287118e35418471d83ff2cda4c01998b8748e8aa88947f45d43a5a40: Status 404 returned error can't find the container with id f61c5c9e287118e35418471d83ff2cda4c01998b8748e8aa88947f45d43a5a40 Nov 28 13:20:37 crc kubenswrapper[4970]: I1128 13:20:37.081938 4970 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Nov 28 13:20:37 crc kubenswrapper[4970]: E1128 13:20:37.082129 4970 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="78210a9d-d2ee-4d21-a0e5-956cb8fd85d2" containerName="collect-profiles" Nov 28 13:20:37 crc kubenswrapper[4970]: I1128 13:20:37.082141 4970 state_mem.go:107] "Deleted CPUSet assignment" podUID="78210a9d-d2ee-4d21-a0e5-956cb8fd85d2" containerName="collect-profiles" Nov 28 13:20:37 crc kubenswrapper[4970]: I1128 13:20:37.082235 4970 memory_manager.go:354] "RemoveStaleState removing state" podUID="78210a9d-d2ee-4d21-a0e5-956cb8fd85d2" containerName="collect-profiles" Nov 28 13:20:37 crc kubenswrapper[4970]: I1128 13:20:37.082569 4970 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Nov 28 13:20:37 crc kubenswrapper[4970]: I1128 13:20:37.084270 4970 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt" Nov 28 13:20:37 crc kubenswrapper[4970]: I1128 13:20:37.085839 4970 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager"/"installer-sa-dockercfg-kjl2n" Nov 28 13:20:37 crc kubenswrapper[4970]: I1128 13:20:37.092896 4970 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Nov 28 13:20:37 crc kubenswrapper[4970]: I1128 13:20:37.132760 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-2qght" event={"ID":"24797c3c-a0a6-4bab-b61e-dcc1aaedcccb","Type":"ContainerStarted","Data":"8fcd50d78074f61737e01e12b8c17891e93a75a486bab6e30b6f5867937bdcf0"} Nov 28 13:20:37 crc kubenswrapper[4970]: I1128 13:20:37.134888 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29405595-bgxjt" event={"ID":"78210a9d-d2ee-4d21-a0e5-956cb8fd85d2","Type":"ContainerDied","Data":"c169f5c9f508bf86308ea14824e6e46b1c74e2ec94d26067c6007aa80b1b5494"} Nov 28 13:20:37 crc kubenswrapper[4970]: I1128 13:20:37.134912 4970 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c169f5c9f508bf86308ea14824e6e46b1c74e2ec94d26067c6007aa80b1b5494" Nov 28 13:20:37 crc kubenswrapper[4970]: I1128 13:20:37.134947 4970 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29405595-bgxjt" Nov 28 13:20:37 crc kubenswrapper[4970]: I1128 13:20:37.141331 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/52b6cb10-d2a3-4df9-b76b-ef5e293f699d-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"52b6cb10-d2a3-4df9-b76b-ef5e293f699d\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Nov 28 13:20:37 crc kubenswrapper[4970]: I1128 13:20:37.141388 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/52b6cb10-d2a3-4df9-b76b-ef5e293f699d-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"52b6cb10-d2a3-4df9-b76b-ef5e293f699d\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Nov 28 13:20:37 crc kubenswrapper[4970]: I1128 13:20:37.142163 4970 generic.go:334] "Generic (PLEG): container finished" podID="619af67d-331c-4b38-b536-269ba823fd75" containerID="724d8855caf229a5c2bf2f89ca8513d44756c2e9e14982489cb8cce2217608c1" exitCode=0 Nov 28 13:20:37 crc kubenswrapper[4970]: I1128 13:20:37.142238 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jcv9n" event={"ID":"619af67d-331c-4b38-b536-269ba823fd75","Type":"ContainerDied","Data":"724d8855caf229a5c2bf2f89ca8513d44756c2e9e14982489cb8cce2217608c1"} Nov 28 13:20:37 crc kubenswrapper[4970]: I1128 13:20:37.143884 4970 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Nov 28 13:20:37 crc kubenswrapper[4970]: I1128 13:20:37.144722 4970 generic.go:334] "Generic (PLEG): container finished" podID="507a4c7b-eff8-4695-a222-3a40b0483eb8" 
containerID="22190e6d1ceed4e3ebb077d9a6db4582561199a9eeb757785c712980390d51da" exitCode=0 Nov 28 13:20:37 crc kubenswrapper[4970]: I1128 13:20:37.144795 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-l4xkm" event={"ID":"507a4c7b-eff8-4695-a222-3a40b0483eb8","Type":"ContainerDied","Data":"22190e6d1ceed4e3ebb077d9a6db4582561199a9eeb757785c712980390d51da"} Nov 28 13:20:37 crc kubenswrapper[4970]: I1128 13:20:37.144829 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-l4xkm" event={"ID":"507a4c7b-eff8-4695-a222-3a40b0483eb8","Type":"ContainerStarted","Data":"f61c5c9e287118e35418471d83ff2cda4c01998b8748e8aa88947f45d43a5a40"} Nov 28 13:20:37 crc kubenswrapper[4970]: I1128 13:20:37.146379 4970 generic.go:334] "Generic (PLEG): container finished" podID="fdf78924-9472-414e-baf6-822e511c464c" containerID="d54ce830ae198678b7db53aadefa7d134348e1b99e4ab8a803f24bee3f263931" exitCode=0 Nov 28 13:20:37 crc kubenswrapper[4970]: I1128 13:20:37.146421 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mwg2n" event={"ID":"fdf78924-9472-414e-baf6-822e511c464c","Type":"ContainerDied","Data":"d54ce830ae198678b7db53aadefa7d134348e1b99e4ab8a803f24bee3f263931"} Nov 28 13:20:37 crc kubenswrapper[4970]: I1128 13:20:37.146438 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mwg2n" event={"ID":"fdf78924-9472-414e-baf6-822e511c464c","Type":"ContainerStarted","Data":"dc6ae09621ca3f4bb4d17e2272a3fa66143085d76106685f129f884cf64db7fe"} Nov 28 13:20:37 crc kubenswrapper[4970]: I1128 13:20:37.148560 4970 generic.go:334] "Generic (PLEG): container finished" podID="ea4c7183-d326-44cd-8e40-649e3dad901e" containerID="e7fa2e9098159163db26df972fe0f1a1fbeec851879837c3d9e3e9c90397ebf3" exitCode=0 Nov 28 13:20:37 crc kubenswrapper[4970]: I1128 13:20:37.148662 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5jvzw" event={"ID":"ea4c7183-d326-44cd-8e40-649e3dad901e","Type":"ContainerDied","Data":"e7fa2e9098159163db26df972fe0f1a1fbeec851879837c3d9e3e9c90397ebf3"} Nov 28 13:20:37 crc kubenswrapper[4970]: I1128 13:20:37.158701 4970 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="hostpath-provisioner/csi-hostpathplugin-2qght" podStartSLOduration=12.158691253 podStartE2EDuration="12.158691253s" podCreationTimestamp="2025-11-28 13:20:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-28 13:20:37.157953971 +0000 UTC m=+48.010835771" watchObservedRunningTime="2025-11-28 13:20:37.158691253 +0000 UTC m=+48.011573053" Nov 28 13:20:37 crc kubenswrapper[4970]: I1128 13:20:37.243335 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/52b6cb10-d2a3-4df9-b76b-ef5e293f699d-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"52b6cb10-d2a3-4df9-b76b-ef5e293f699d\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Nov 28 13:20:37 crc kubenswrapper[4970]: I1128 13:20:37.243690 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/52b6cb10-d2a3-4df9-b76b-ef5e293f699d-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"52b6cb10-d2a3-4df9-b76b-ef5e293f699d\") " 
pod="openshift-kube-controller-manager/revision-pruner-9-crc" Nov 28 13:20:37 crc kubenswrapper[4970]: I1128 13:20:37.245317 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/52b6cb10-d2a3-4df9-b76b-ef5e293f699d-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"52b6cb10-d2a3-4df9-b76b-ef5e293f699d\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Nov 28 13:20:37 crc kubenswrapper[4970]: I1128 13:20:37.260487 4970 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-tjj4p"] Nov 28 13:20:37 crc kubenswrapper[4970]: W1128 13:20:37.269929 4970 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod309db78c_54d0_452b_8b62_979217816260.slice/crio-de9d7f5e240df14e268e2a1b1bfab21dc4f1e2f9cfe1b777dba65eb4a8825fd0 WatchSource:0}: Error finding container de9d7f5e240df14e268e2a1b1bfab21dc4f1e2f9cfe1b777dba65eb4a8825fd0: Status 404 returned error can't find the container with id de9d7f5e240df14e268e2a1b1bfab21dc4f1e2f9cfe1b777dba65eb4a8825fd0 Nov 28 13:20:37 crc kubenswrapper[4970]: I1128 13:20:37.274461 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/52b6cb10-d2a3-4df9-b76b-ef5e293f699d-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"52b6cb10-d2a3-4df9-b76b-ef5e293f699d\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Nov 28 13:20:37 crc kubenswrapper[4970]: I1128 13:20:37.389682 4970 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f668bae-612b-4b75-9490-919e737c6a3b" path="/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes" Nov 28 13:20:37 crc kubenswrapper[4970]: I1128 13:20:37.394356 4970 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Nov 28 13:20:37 crc kubenswrapper[4970]: I1128 13:20:37.511151 4970 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-8hjm2"] Nov 28 13:20:37 crc kubenswrapper[4970]: I1128 13:20:37.512286 4970 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-8hjm2" Nov 28 13:20:37 crc kubenswrapper[4970]: I1128 13:20:37.517489 4970 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Nov 28 13:20:37 crc kubenswrapper[4970]: I1128 13:20:37.531970 4970 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-config-operator/openshift-config-operator-7777fb866f-7nxhr" Nov 28 13:20:37 crc kubenswrapper[4970]: I1128 13:20:37.547996 4970 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-8hjm2"] Nov 28 13:20:37 crc kubenswrapper[4970]: I1128 13:20:37.557755 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5h75m\" (UniqueName: \"kubernetes.io/projected/ac87c2e3-5a6b-4998-8db7-165e571f6f52-kube-api-access-5h75m\") pod \"redhat-marketplace-8hjm2\" (UID: \"ac87c2e3-5a6b-4998-8db7-165e571f6f52\") " pod="openshift-marketplace/redhat-marketplace-8hjm2" Nov 28 13:20:37 crc kubenswrapper[4970]: I1128 13:20:37.557796 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ac87c2e3-5a6b-4998-8db7-165e571f6f52-utilities\") pod \"redhat-marketplace-8hjm2\" (UID: \"ac87c2e3-5a6b-4998-8db7-165e571f6f52\") " pod="openshift-marketplace/redhat-marketplace-8hjm2" Nov 28 13:20:37 crc kubenswrapper[4970]: I1128 13:20:37.557851 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ac87c2e3-5a6b-4998-8db7-165e571f6f52-catalog-content\") pod \"redhat-marketplace-8hjm2\" (UID: \"ac87c2e3-5a6b-4998-8db7-165e571f6f52\") " pod="openshift-marketplace/redhat-marketplace-8hjm2" Nov 28 13:20:37 crc kubenswrapper[4970]: I1128 13:20:37.571233 4970 patch_prober.go:28] interesting pod/router-default-5444994796-6tn57 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Nov 28 13:20:37 crc kubenswrapper[4970]: [-]has-synced failed: reason withheld Nov 28 13:20:37 crc kubenswrapper[4970]: [+]process-running ok Nov 28 13:20:37 crc kubenswrapper[4970]: healthz check failed Nov 28 13:20:37 crc kubenswrapper[4970]: I1128 13:20:37.571288 4970 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-6tn57" podUID="973ba3b3-d07b-40ef-8419-40e19838e816" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Nov 28 13:20:37 crc kubenswrapper[4970]: I1128 13:20:37.658840 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5h75m\" (UniqueName: \"kubernetes.io/projected/ac87c2e3-5a6b-4998-8db7-165e571f6f52-kube-api-access-5h75m\") pod \"redhat-marketplace-8hjm2\" (UID: \"ac87c2e3-5a6b-4998-8db7-165e571f6f52\") " pod="openshift-marketplace/redhat-marketplace-8hjm2" Nov 28 13:20:37 crc kubenswrapper[4970]: I1128 13:20:37.659157 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ac87c2e3-5a6b-4998-8db7-165e571f6f52-utilities\") pod \"redhat-marketplace-8hjm2\" (UID: \"ac87c2e3-5a6b-4998-8db7-165e571f6f52\") " pod="openshift-marketplace/redhat-marketplace-8hjm2" Nov 28 13:20:37 crc 
kubenswrapper[4970]: I1128 13:20:37.659256 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ac87c2e3-5a6b-4998-8db7-165e571f6f52-catalog-content\") pod \"redhat-marketplace-8hjm2\" (UID: \"ac87c2e3-5a6b-4998-8db7-165e571f6f52\") " pod="openshift-marketplace/redhat-marketplace-8hjm2" Nov 28 13:20:37 crc kubenswrapper[4970]: I1128 13:20:37.659828 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ac87c2e3-5a6b-4998-8db7-165e571f6f52-utilities\") pod \"redhat-marketplace-8hjm2\" (UID: \"ac87c2e3-5a6b-4998-8db7-165e571f6f52\") " pod="openshift-marketplace/redhat-marketplace-8hjm2" Nov 28 13:20:37 crc kubenswrapper[4970]: I1128 13:20:37.659914 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ac87c2e3-5a6b-4998-8db7-165e571f6f52-catalog-content\") pod \"redhat-marketplace-8hjm2\" (UID: \"ac87c2e3-5a6b-4998-8db7-165e571f6f52\") " pod="openshift-marketplace/redhat-marketplace-8hjm2" Nov 28 13:20:37 crc kubenswrapper[4970]: I1128 13:20:37.679889 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5h75m\" (UniqueName: \"kubernetes.io/projected/ac87c2e3-5a6b-4998-8db7-165e571f6f52-kube-api-access-5h75m\") pod \"redhat-marketplace-8hjm2\" (UID: \"ac87c2e3-5a6b-4998-8db7-165e571f6f52\") " pod="openshift-marketplace/redhat-marketplace-8hjm2" Nov 28 13:20:37 crc kubenswrapper[4970]: I1128 13:20:37.738433 4970 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Nov 28 13:20:37 crc kubenswrapper[4970]: I1128 13:20:37.828097 4970 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-8hjm2" Nov 28 13:20:37 crc kubenswrapper[4970]: I1128 13:20:37.910563 4970 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-nzvwp"] Nov 28 13:20:37 crc kubenswrapper[4970]: I1128 13:20:37.911797 4970 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-nzvwp" Nov 28 13:20:37 crc kubenswrapper[4970]: I1128 13:20:37.917299 4970 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-nzvwp"] Nov 28 13:20:37 crc kubenswrapper[4970]: I1128 13:20:37.963282 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pr975\" (UniqueName: \"kubernetes.io/projected/4cae710a-4284-4d76-b507-d7aa55adba72-kube-api-access-pr975\") pod \"redhat-marketplace-nzvwp\" (UID: \"4cae710a-4284-4d76-b507-d7aa55adba72\") " pod="openshift-marketplace/redhat-marketplace-nzvwp" Nov 28 13:20:37 crc kubenswrapper[4970]: I1128 13:20:37.963673 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4cae710a-4284-4d76-b507-d7aa55adba72-catalog-content\") pod \"redhat-marketplace-nzvwp\" (UID: \"4cae710a-4284-4d76-b507-d7aa55adba72\") " pod="openshift-marketplace/redhat-marketplace-nzvwp" Nov 28 13:20:37 crc kubenswrapper[4970]: I1128 13:20:37.963706 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4cae710a-4284-4d76-b507-d7aa55adba72-utilities\") pod \"redhat-marketplace-nzvwp\" (UID: \"4cae710a-4284-4d76-b507-d7aa55adba72\") " pod="openshift-marketplace/redhat-marketplace-nzvwp" Nov 28 13:20:38 crc kubenswrapper[4970]: I1128 13:20:38.049446 4970 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-apiserver/apiserver-76f77b778f-78pq5" Nov 28 13:20:38 crc kubenswrapper[4970]: I1128 13:20:38.049491 4970 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-apiserver/apiserver-76f77b778f-78pq5" Nov 28 13:20:38 crc kubenswrapper[4970]: I1128 13:20:38.056646 4970 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-apiserver/apiserver-76f77b778f-78pq5" Nov 28 13:20:38 crc kubenswrapper[4970]: I1128 13:20:38.064992 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4cae710a-4284-4d76-b507-d7aa55adba72-catalog-content\") pod \"redhat-marketplace-nzvwp\" (UID: \"4cae710a-4284-4d76-b507-d7aa55adba72\") " pod="openshift-marketplace/redhat-marketplace-nzvwp" Nov 28 13:20:38 crc kubenswrapper[4970]: I1128 13:20:38.065051 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4cae710a-4284-4d76-b507-d7aa55adba72-utilities\") pod \"redhat-marketplace-nzvwp\" (UID: \"4cae710a-4284-4d76-b507-d7aa55adba72\") " pod="openshift-marketplace/redhat-marketplace-nzvwp" Nov 28 13:20:38 crc kubenswrapper[4970]: I1128 13:20:38.065153 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pr975\" (UniqueName: \"kubernetes.io/projected/4cae710a-4284-4d76-b507-d7aa55adba72-kube-api-access-pr975\") pod \"redhat-marketplace-nzvwp\" (UID: \"4cae710a-4284-4d76-b507-d7aa55adba72\") " pod="openshift-marketplace/redhat-marketplace-nzvwp" Nov 28 13:20:38 crc kubenswrapper[4970]: I1128 13:20:38.065619 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4cae710a-4284-4d76-b507-d7aa55adba72-catalog-content\") pod \"redhat-marketplace-nzvwp\" (UID: 
\"4cae710a-4284-4d76-b507-d7aa55adba72\") " pod="openshift-marketplace/redhat-marketplace-nzvwp" Nov 28 13:20:38 crc kubenswrapper[4970]: I1128 13:20:38.065797 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4cae710a-4284-4d76-b507-d7aa55adba72-utilities\") pod \"redhat-marketplace-nzvwp\" (UID: \"4cae710a-4284-4d76-b507-d7aa55adba72\") " pod="openshift-marketplace/redhat-marketplace-nzvwp" Nov 28 13:20:38 crc kubenswrapper[4970]: I1128 13:20:38.088844 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pr975\" (UniqueName: \"kubernetes.io/projected/4cae710a-4284-4d76-b507-d7aa55adba72-kube-api-access-pr975\") pod \"redhat-marketplace-nzvwp\" (UID: \"4cae710a-4284-4d76-b507-d7aa55adba72\") " pod="openshift-marketplace/redhat-marketplace-nzvwp" Nov 28 13:20:38 crc kubenswrapper[4970]: I1128 13:20:38.153974 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"52b6cb10-d2a3-4df9-b76b-ef5e293f699d","Type":"ContainerStarted","Data":"579565ae48fd77b4b39b6be68cf6e0afa7bc3585de499a1ea8f13b716b95a1c9"} Nov 28 13:20:38 crc kubenswrapper[4970]: I1128 13:20:38.155625 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-tjj4p" event={"ID":"309db78c-54d0-452b-8b62-979217816260","Type":"ContainerStarted","Data":"58260184f068b7112b9d37999434bdfd071637de831c2bf10e1367c0412d414a"} Nov 28 13:20:38 crc kubenswrapper[4970]: I1128 13:20:38.155656 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-tjj4p" event={"ID":"309db78c-54d0-452b-8b62-979217816260","Type":"ContainerStarted","Data":"de9d7f5e240df14e268e2a1b1bfab21dc4f1e2f9cfe1b777dba65eb4a8825fd0"} Nov 28 13:20:38 crc kubenswrapper[4970]: I1128 13:20:38.161032 4970 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-apiserver/apiserver-76f77b778f-78pq5" Nov 28 13:20:38 crc kubenswrapper[4970]: I1128 13:20:38.178698 4970 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-697d97f7c8-tjj4p" podStartSLOduration=25.1786779 podStartE2EDuration="25.1786779s" podCreationTimestamp="2025-11-28 13:20:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-28 13:20:38.175607999 +0000 UTC m=+49.028489809" watchObservedRunningTime="2025-11-28 13:20:38.1786779 +0000 UTC m=+49.031559710" Nov 28 13:20:38 crc kubenswrapper[4970]: I1128 13:20:38.188639 4970 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-v5kwk" Nov 28 13:20:38 crc kubenswrapper[4970]: I1128 13:20:38.189040 4970 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-v5kwk" Nov 28 13:20:38 crc kubenswrapper[4970]: I1128 13:20:38.195897 4970 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-v5kwk" Nov 28 13:20:38 crc kubenswrapper[4970]: I1128 13:20:38.230294 4970 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-nzvwp" Nov 28 13:20:38 crc kubenswrapper[4970]: I1128 13:20:38.281869 4970 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-8hjm2"] Nov 28 13:20:38 crc kubenswrapper[4970]: W1128 13:20:38.295600 4970 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podac87c2e3_5a6b_4998_8db7_165e571f6f52.slice/crio-9f6690e4b3da7cb1949cd7350c0cd605e2b753e77d450c0302323c8b4f5b87e0 WatchSource:0}: Error finding container 9f6690e4b3da7cb1949cd7350c0cd605e2b753e77d450c0302323c8b4f5b87e0: Status 404 returned error can't find the container with id 9f6690e4b3da7cb1949cd7350c0cd605e2b753e77d450c0302323c8b4f5b87e0 Nov 28 13:20:38 crc kubenswrapper[4970]: I1128 13:20:38.494882 4970 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-nzvwp"] Nov 28 13:20:38 crc kubenswrapper[4970]: I1128 13:20:38.503551 4970 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-wwr74"] Nov 28 13:20:38 crc kubenswrapper[4970]: I1128 13:20:38.506301 4970 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-wwr74" Nov 28 13:20:38 crc kubenswrapper[4970]: I1128 13:20:38.508864 4970 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Nov 28 13:20:38 crc kubenswrapper[4970]: I1128 13:20:38.511744 4970 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-wwr74"] Nov 28 13:20:38 crc kubenswrapper[4970]: I1128 13:20:38.513796 4970 patch_prober.go:28] interesting pod/downloads-7954f5f757-9chbz container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.12:8080/\": dial tcp 10.217.0.12:8080: connect: connection refused" start-of-body= Nov 28 13:20:38 crc kubenswrapper[4970]: I1128 13:20:38.513832 4970 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-9chbz" podUID="79187155-9c7e-48a9-a3f8-3bcf8d921be6" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.12:8080/\": dial tcp 10.217.0.12:8080: connect: connection refused" Nov 28 13:20:38 crc kubenswrapper[4970]: I1128 13:20:38.514041 4970 patch_prober.go:28] interesting pod/downloads-7954f5f757-9chbz container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.12:8080/\": dial tcp 10.217.0.12:8080: connect: connection refused" start-of-body= Nov 28 13:20:38 crc kubenswrapper[4970]: I1128 13:20:38.514438 4970 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-9chbz" podUID="79187155-9c7e-48a9-a3f8-3bcf8d921be6" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.12:8080/\": dial tcp 10.217.0.12:8080: connect: connection refused" Nov 28 13:20:38 crc kubenswrapper[4970]: I1128 13:20:38.572018 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a92f4929-dcab-4362-a5f2-c648f274bf04-utilities\") pod \"redhat-operators-wwr74\" (UID: \"a92f4929-dcab-4362-a5f2-c648f274bf04\") " pod="openshift-marketplace/redhat-operators-wwr74" Nov 28 13:20:38 crc kubenswrapper[4970]: I1128 13:20:38.572051 4970 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ingress/router-default-5444994796-6tn57" Nov 28 13:20:38 crc kubenswrapper[4970]: I1128 13:20:38.572057 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a92f4929-dcab-4362-a5f2-c648f274bf04-catalog-content\") pod \"redhat-operators-wwr74\" (UID: \"a92f4929-dcab-4362-a5f2-c648f274bf04\") " pod="openshift-marketplace/redhat-operators-wwr74" Nov 28 13:20:38 crc kubenswrapper[4970]: I1128 13:20:38.572184 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fjh5c\" (UniqueName: \"kubernetes.io/projected/a92f4929-dcab-4362-a5f2-c648f274bf04-kube-api-access-fjh5c\") pod \"redhat-operators-wwr74\" (UID: \"a92f4929-dcab-4362-a5f2-c648f274bf04\") " pod="openshift-marketplace/redhat-operators-wwr74" Nov 28 13:20:38 crc kubenswrapper[4970]: I1128 13:20:38.575308 4970 patch_prober.go:28] interesting pod/router-default-5444994796-6tn57 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Nov 28 13:20:38 crc kubenswrapper[4970]: [-]has-synced failed: reason withheld Nov 28 13:20:38 crc kubenswrapper[4970]: [+]process-running ok Nov 28 13:20:38 crc kubenswrapper[4970]: healthz check failed Nov 28 13:20:38 crc kubenswrapper[4970]: I1128 13:20:38.575383 4970 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-6tn57" podUID="973ba3b3-d07b-40ef-8419-40e19838e816" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Nov 28 13:20:38 crc kubenswrapper[4970]: I1128 13:20:38.673479 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fjh5c\" (UniqueName: \"kubernetes.io/projected/a92f4929-dcab-4362-a5f2-c648f274bf04-kube-api-access-fjh5c\") pod \"redhat-operators-wwr74\" (UID: \"a92f4929-dcab-4362-a5f2-c648f274bf04\") " pod="openshift-marketplace/redhat-operators-wwr74" Nov 28 13:20:38 crc kubenswrapper[4970]: I1128 13:20:38.673539 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a92f4929-dcab-4362-a5f2-c648f274bf04-utilities\") pod \"redhat-operators-wwr74\" (UID: \"a92f4929-dcab-4362-a5f2-c648f274bf04\") " pod="openshift-marketplace/redhat-operators-wwr74" Nov 28 13:20:38 crc kubenswrapper[4970]: I1128 13:20:38.673560 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a92f4929-dcab-4362-a5f2-c648f274bf04-catalog-content\") pod \"redhat-operators-wwr74\" (UID: \"a92f4929-dcab-4362-a5f2-c648f274bf04\") " pod="openshift-marketplace/redhat-operators-wwr74" Nov 28 13:20:38 crc kubenswrapper[4970]: I1128 13:20:38.674620 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a92f4929-dcab-4362-a5f2-c648f274bf04-catalog-content\") pod \"redhat-operators-wwr74\" (UID: \"a92f4929-dcab-4362-a5f2-c648f274bf04\") " pod="openshift-marketplace/redhat-operators-wwr74" Nov 28 13:20:38 crc kubenswrapper[4970]: I1128 13:20:38.675038 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/a92f4929-dcab-4362-a5f2-c648f274bf04-utilities\") pod \"redhat-operators-wwr74\" (UID: \"a92f4929-dcab-4362-a5f2-c648f274bf04\") " pod="openshift-marketplace/redhat-operators-wwr74" Nov 28 13:20:38 crc kubenswrapper[4970]: I1128 13:20:38.695530 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fjh5c\" (UniqueName: \"kubernetes.io/projected/a92f4929-dcab-4362-a5f2-c648f274bf04-kube-api-access-fjh5c\") pod \"redhat-operators-wwr74\" (UID: \"a92f4929-dcab-4362-a5f2-c648f274bf04\") " pod="openshift-marketplace/redhat-operators-wwr74" Nov 28 13:20:38 crc kubenswrapper[4970]: I1128 13:20:38.828116 4970 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-f9d7485db-kwnx5" Nov 28 13:20:38 crc kubenswrapper[4970]: I1128 13:20:38.828176 4970 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-f9d7485db-kwnx5" Nov 28 13:20:38 crc kubenswrapper[4970]: E1128 13:20:38.828885 4970 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="2fc28451c84d1813a773345a163950b323fe8d1c9f7deb705e1af63c1f469e25" cmd=["/bin/bash","-c","test -f /ready/ready"] Nov 28 13:20:38 crc kubenswrapper[4970]: I1128 13:20:38.835864 4970 patch_prober.go:28] interesting pod/console-f9d7485db-kwnx5 container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.27:8443/health\": dial tcp 10.217.0.27:8443: connect: connection refused" start-of-body= Nov 28 13:20:38 crc kubenswrapper[4970]: I1128 13:20:38.835988 4970 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-kwnx5" podUID="cbdc9822-68a6-4bff-b373-cac82f25f4d3" containerName="console" probeResult="failure" output="Get \"https://10.217.0.27:8443/health\": dial tcp 10.217.0.27:8443: connect: connection refused" Nov 28 13:20:38 crc kubenswrapper[4970]: I1128 13:20:38.841334 4970 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-wwr74" Nov 28 13:20:38 crc kubenswrapper[4970]: E1128 13:20:38.842658 4970 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="2fc28451c84d1813a773345a163950b323fe8d1c9f7deb705e1af63c1f469e25" cmd=["/bin/bash","-c","test -f /ready/ready"] Nov 28 13:20:38 crc kubenswrapper[4970]: E1128 13:20:38.844698 4970 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="2fc28451c84d1813a773345a163950b323fe8d1c9f7deb705e1af63c1f469e25" cmd=["/bin/bash","-c","test -f /ready/ready"] Nov 28 13:20:38 crc kubenswrapper[4970]: E1128 13:20:38.844779 4970 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openshift-multus/cni-sysctl-allowlist-ds-8btsr" podUID="a5b9dda0-da70-4e7c-850b-de8b7744a15c" containerName="kube-multus-additional-cni-plugins" Nov 28 13:20:38 crc kubenswrapper[4970]: I1128 13:20:38.916611 4970 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-fp5dk"] Nov 28 13:20:38 crc kubenswrapper[4970]: I1128 13:20:38.918427 4970 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-fp5dk" Nov 28 13:20:38 crc kubenswrapper[4970]: I1128 13:20:38.940064 4970 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-fp5dk"] Nov 28 13:20:38 crc kubenswrapper[4970]: I1128 13:20:38.977727 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7d9651e6-3bf3-4aad-b2ef-8fb6c42eea44-catalog-content\") pod \"redhat-operators-fp5dk\" (UID: \"7d9651e6-3bf3-4aad-b2ef-8fb6c42eea44\") " pod="openshift-marketplace/redhat-operators-fp5dk" Nov 28 13:20:38 crc kubenswrapper[4970]: I1128 13:20:38.977769 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tdkxs\" (UniqueName: \"kubernetes.io/projected/7d9651e6-3bf3-4aad-b2ef-8fb6c42eea44-kube-api-access-tdkxs\") pod \"redhat-operators-fp5dk\" (UID: \"7d9651e6-3bf3-4aad-b2ef-8fb6c42eea44\") " pod="openshift-marketplace/redhat-operators-fp5dk" Nov 28 13:20:38 crc kubenswrapper[4970]: I1128 13:20:38.977841 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7d9651e6-3bf3-4aad-b2ef-8fb6c42eea44-utilities\") pod \"redhat-operators-fp5dk\" (UID: \"7d9651e6-3bf3-4aad-b2ef-8fb6c42eea44\") " pod="openshift-marketplace/redhat-operators-fp5dk" Nov 28 13:20:39 crc kubenswrapper[4970]: I1128 13:20:39.079506 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7d9651e6-3bf3-4aad-b2ef-8fb6c42eea44-utilities\") pod \"redhat-operators-fp5dk\" (UID: \"7d9651e6-3bf3-4aad-b2ef-8fb6c42eea44\") " pod="openshift-marketplace/redhat-operators-fp5dk" Nov 28 13:20:39 crc kubenswrapper[4970]: I1128 13:20:39.079618 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7d9651e6-3bf3-4aad-b2ef-8fb6c42eea44-catalog-content\") pod \"redhat-operators-fp5dk\" (UID: \"7d9651e6-3bf3-4aad-b2ef-8fb6c42eea44\") " pod="openshift-marketplace/redhat-operators-fp5dk" Nov 28 13:20:39 crc kubenswrapper[4970]: I1128 13:20:39.079644 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tdkxs\" (UniqueName: \"kubernetes.io/projected/7d9651e6-3bf3-4aad-b2ef-8fb6c42eea44-kube-api-access-tdkxs\") pod \"redhat-operators-fp5dk\" (UID: \"7d9651e6-3bf3-4aad-b2ef-8fb6c42eea44\") " pod="openshift-marketplace/redhat-operators-fp5dk" Nov 28 13:20:39 crc kubenswrapper[4970]: I1128 13:20:39.080589 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7d9651e6-3bf3-4aad-b2ef-8fb6c42eea44-catalog-content\") pod \"redhat-operators-fp5dk\" (UID: \"7d9651e6-3bf3-4aad-b2ef-8fb6c42eea44\") " pod="openshift-marketplace/redhat-operators-fp5dk" Nov 28 13:20:39 crc kubenswrapper[4970]: I1128 13:20:39.087404 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7d9651e6-3bf3-4aad-b2ef-8fb6c42eea44-utilities\") pod \"redhat-operators-fp5dk\" (UID: \"7d9651e6-3bf3-4aad-b2ef-8fb6c42eea44\") " pod="openshift-marketplace/redhat-operators-fp5dk" Nov 28 13:20:39 crc kubenswrapper[4970]: I1128 13:20:39.101142 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tdkxs\" (UniqueName: \"kubernetes.io/projected/7d9651e6-3bf3-4aad-b2ef-8fb6c42eea44-kube-api-access-tdkxs\") pod \"redhat-operators-fp5dk\" (UID: \"7d9651e6-3bf3-4aad-b2ef-8fb6c42eea44\") " pod="openshift-marketplace/redhat-operators-fp5dk" Nov 28 13:20:39 crc kubenswrapper[4970]: I1128 13:20:39.171089 4970 generic.go:334] "Generic (PLEG): container finished" podID="ac87c2e3-5a6b-4998-8db7-165e571f6f52" containerID="ca9c04814cbeae9949fc3df08a9deba4cf5b237e7a041e34809909ade3306228" exitCode=0 Nov 28 13:20:39 crc kubenswrapper[4970]: I1128 13:20:39.171174 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8hjm2" event={"ID":"ac87c2e3-5a6b-4998-8db7-165e571f6f52","Type":"ContainerDied","Data":"ca9c04814cbeae9949fc3df08a9deba4cf5b237e7a041e34809909ade3306228"} Nov 28 13:20:39 crc kubenswrapper[4970]: I1128 13:20:39.171203 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8hjm2" event={"ID":"ac87c2e3-5a6b-4998-8db7-165e571f6f52","Type":"ContainerStarted","Data":"9f6690e4b3da7cb1949cd7350c0cd605e2b753e77d450c0302323c8b4f5b87e0"} Nov 28 13:20:39 crc kubenswrapper[4970]: I1128 13:20:39.176099 4970 generic.go:334] "Generic (PLEG): container finished" podID="52b6cb10-d2a3-4df9-b76b-ef5e293f699d" containerID="9ae3ce704520cd32a23b41215f5129b20a4f3d2ba1919bd66ff407bb2d2fc921" exitCode=0 Nov 28 13:20:39 crc kubenswrapper[4970]: I1128 13:20:39.176181 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"52b6cb10-d2a3-4df9-b76b-ef5e293f699d","Type":"ContainerDied","Data":"9ae3ce704520cd32a23b41215f5129b20a4f3d2ba1919bd66ff407bb2d2fc921"} Nov 28 13:20:39 crc kubenswrapper[4970]: I1128 13:20:39.178647 4970 generic.go:334] "Generic (PLEG): container finished" podID="4cae710a-4284-4d76-b507-d7aa55adba72" containerID="b57a59417898fc205319aa12c7e8a7a4eacd104247be5f4b01e36f5b17fc90e5" exitCode=0 Nov 28 13:20:39 
crc kubenswrapper[4970]: I1128 13:20:39.178755 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-nzvwp" event={"ID":"4cae710a-4284-4d76-b507-d7aa55adba72","Type":"ContainerDied","Data":"b57a59417898fc205319aa12c7e8a7a4eacd104247be5f4b01e36f5b17fc90e5"} Nov 28 13:20:39 crc kubenswrapper[4970]: I1128 13:20:39.178804 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-nzvwp" event={"ID":"4cae710a-4284-4d76-b507-d7aa55adba72","Type":"ContainerStarted","Data":"608e565eced9fac804dc65085933b01f5b3309ad1e04b99b18e6465efaf67484"} Nov 28 13:20:39 crc kubenswrapper[4970]: I1128 13:20:39.180405 4970 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-697d97f7c8-tjj4p" Nov 28 13:20:39 crc kubenswrapper[4970]: I1128 13:20:39.192502 4970 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-v5kwk" Nov 28 13:20:39 crc kubenswrapper[4970]: I1128 13:20:39.234877 4970 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-fp5dk" Nov 28 13:20:39 crc kubenswrapper[4970]: I1128 13:20:39.319737 4970 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-wwr74"] Nov 28 13:20:39 crc kubenswrapper[4970]: I1128 13:20:39.572033 4970 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-5444994796-6tn57" Nov 28 13:20:39 crc kubenswrapper[4970]: I1128 13:20:39.576312 4970 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-5444994796-6tn57" Nov 28 13:20:39 crc kubenswrapper[4970]: I1128 13:20:39.591565 4970 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-fp5dk"] Nov 28 13:20:40 crc kubenswrapper[4970]: I1128 13:20:40.167460 4970 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Nov 28 13:20:40 crc kubenswrapper[4970]: I1128 13:20:40.189240 4970 generic.go:334] "Generic (PLEG): container finished" podID="7d9651e6-3bf3-4aad-b2ef-8fb6c42eea44" containerID="32586cf197490c07dac50c81b83615ac99ca6abe43778b63e072f8737d6d7b65" exitCode=0 Nov 28 13:20:40 crc kubenswrapper[4970]: I1128 13:20:40.189326 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fp5dk" event={"ID":"7d9651e6-3bf3-4aad-b2ef-8fb6c42eea44","Type":"ContainerDied","Data":"32586cf197490c07dac50c81b83615ac99ca6abe43778b63e072f8737d6d7b65"} Nov 28 13:20:40 crc kubenswrapper[4970]: I1128 13:20:40.189380 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fp5dk" event={"ID":"7d9651e6-3bf3-4aad-b2ef-8fb6c42eea44","Type":"ContainerStarted","Data":"48f75a5fce6a9a368f5ce73833ee59ddcc53540ea63ec8ccb6d954e252cfd5b4"} Nov 28 13:20:40 crc kubenswrapper[4970]: I1128 13:20:40.190966 4970 generic.go:334] "Generic (PLEG): container finished" podID="a92f4929-dcab-4362-a5f2-c648f274bf04" containerID="4d61510805f9bef0fada40462a3ee2a09ab6aa97240ca26693a039d7f1fa5343" exitCode=0 Nov 28 13:20:40 crc kubenswrapper[4970]: I1128 13:20:40.191467 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wwr74" 
event={"ID":"a92f4929-dcab-4362-a5f2-c648f274bf04","Type":"ContainerDied","Data":"4d61510805f9bef0fada40462a3ee2a09ab6aa97240ca26693a039d7f1fa5343"} Nov 28 13:20:40 crc kubenswrapper[4970]: I1128 13:20:40.191509 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wwr74" event={"ID":"a92f4929-dcab-4362-a5f2-c648f274bf04","Type":"ContainerStarted","Data":"0f19a79a745538c2959f7eaf13d5862d155dbeea72a7c516e03cfa85ed864e9a"} Nov 28 13:20:40 crc kubenswrapper[4970]: I1128 13:20:40.193093 4970 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/openshift-kube-scheduler-crc"] Nov 28 13:20:40 crc kubenswrapper[4970]: I1128 13:20:40.225339 4970 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podStartSLOduration=0.22532053 podStartE2EDuration="225.32053ms" podCreationTimestamp="2025-11-28 13:20:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-28 13:20:40.222374313 +0000 UTC m=+51.075256113" watchObservedRunningTime="2025-11-28 13:20:40.22532053 +0000 UTC m=+51.078202330" Nov 28 13:20:40 crc kubenswrapper[4970]: I1128 13:20:40.423679 4970 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Nov 28 13:20:40 crc kubenswrapper[4970]: I1128 13:20:40.514371 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/52b6cb10-d2a3-4df9-b76b-ef5e293f699d-kube-api-access\") pod \"52b6cb10-d2a3-4df9-b76b-ef5e293f699d\" (UID: \"52b6cb10-d2a3-4df9-b76b-ef5e293f699d\") " Nov 28 13:20:40 crc kubenswrapper[4970]: I1128 13:20:40.514438 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/52b6cb10-d2a3-4df9-b76b-ef5e293f699d-kubelet-dir\") pod \"52b6cb10-d2a3-4df9-b76b-ef5e293f699d\" (UID: \"52b6cb10-d2a3-4df9-b76b-ef5e293f699d\") " Nov 28 13:20:40 crc kubenswrapper[4970]: I1128 13:20:40.514902 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/52b6cb10-d2a3-4df9-b76b-ef5e293f699d-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "52b6cb10-d2a3-4df9-b76b-ef5e293f699d" (UID: "52b6cb10-d2a3-4df9-b76b-ef5e293f699d"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 28 13:20:40 crc kubenswrapper[4970]: I1128 13:20:40.520053 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/52b6cb10-d2a3-4df9-b76b-ef5e293f699d-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "52b6cb10-d2a3-4df9-b76b-ef5e293f699d" (UID: "52b6cb10-d2a3-4df9-b76b-ef5e293f699d"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 28 13:20:40 crc kubenswrapper[4970]: I1128 13:20:40.616200 4970 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/52b6cb10-d2a3-4df9-b76b-ef5e293f699d-kube-api-access\") on node \"crc\" DevicePath \"\"" Nov 28 13:20:40 crc kubenswrapper[4970]: I1128 13:20:40.616248 4970 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/52b6cb10-d2a3-4df9-b76b-ef5e293f699d-kubelet-dir\") on node \"crc\" DevicePath \"\"" Nov 28 13:20:40 crc kubenswrapper[4970]: I1128 13:20:40.664884 4970 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Nov 28 13:20:40 crc kubenswrapper[4970]: E1128 13:20:40.665155 4970 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="52b6cb10-d2a3-4df9-b76b-ef5e293f699d" containerName="pruner" Nov 28 13:20:40 crc kubenswrapper[4970]: I1128 13:20:40.665167 4970 state_mem.go:107] "Deleted CPUSet assignment" podUID="52b6cb10-d2a3-4df9-b76b-ef5e293f699d" containerName="pruner" Nov 28 13:20:40 crc kubenswrapper[4970]: I1128 13:20:40.665289 4970 memory_manager.go:354] "RemoveStaleState removing state" podUID="52b6cb10-d2a3-4df9-b76b-ef5e293f699d" containerName="pruner" Nov 28 13:20:40 crc kubenswrapper[4970]: I1128 13:20:40.665704 4970 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Nov 28 13:20:40 crc kubenswrapper[4970]: I1128 13:20:40.667580 4970 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Nov 28 13:20:40 crc kubenswrapper[4970]: I1128 13:20:40.667935 4970 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Nov 28 13:20:40 crc kubenswrapper[4970]: I1128 13:20:40.671919 4970 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Nov 28 13:20:40 crc kubenswrapper[4970]: I1128 13:20:40.830721 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/171af68c-5bcc-458f-9ec6-28bb994b3d55-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"171af68c-5bcc-458f-9ec6-28bb994b3d55\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Nov 28 13:20:40 crc kubenswrapper[4970]: I1128 13:20:40.830874 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/171af68c-5bcc-458f-9ec6-28bb994b3d55-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"171af68c-5bcc-458f-9ec6-28bb994b3d55\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Nov 28 13:20:40 crc kubenswrapper[4970]: I1128 13:20:40.932048 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/171af68c-5bcc-458f-9ec6-28bb994b3d55-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"171af68c-5bcc-458f-9ec6-28bb994b3d55\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Nov 28 13:20:40 crc kubenswrapper[4970]: I1128 13:20:40.932183 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/171af68c-5bcc-458f-9ec6-28bb994b3d55-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: 
\"171af68c-5bcc-458f-9ec6-28bb994b3d55\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Nov 28 13:20:40 crc kubenswrapper[4970]: I1128 13:20:40.932207 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/171af68c-5bcc-458f-9ec6-28bb994b3d55-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"171af68c-5bcc-458f-9ec6-28bb994b3d55\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Nov 28 13:20:40 crc kubenswrapper[4970]: I1128 13:20:40.959369 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/171af68c-5bcc-458f-9ec6-28bb994b3d55-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"171af68c-5bcc-458f-9ec6-28bb994b3d55\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Nov 28 13:20:40 crc kubenswrapper[4970]: I1128 13:20:40.978874 4970 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Nov 28 13:20:41 crc kubenswrapper[4970]: I1128 13:20:41.182049 4970 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Nov 28 13:20:41 crc kubenswrapper[4970]: I1128 13:20:41.199670 4970 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Nov 28 13:20:41 crc kubenswrapper[4970]: I1128 13:20:41.202373 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"52b6cb10-d2a3-4df9-b76b-ef5e293f699d","Type":"ContainerDied","Data":"579565ae48fd77b4b39b6be68cf6e0afa7bc3585de499a1ea8f13b716b95a1c9"} Nov 28 13:20:41 crc kubenswrapper[4970]: I1128 13:20:41.202412 4970 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="579565ae48fd77b4b39b6be68cf6e0afa7bc3585de499a1ea8f13b716b95a1c9" Nov 28 13:20:42 crc kubenswrapper[4970]: I1128 13:20:42.210523 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"171af68c-5bcc-458f-9ec6-28bb994b3d55","Type":"ContainerStarted","Data":"672b4058a6cc00c7f9301476a877fb04a868a02ed9e179cfb6c19daba827387a"} Nov 28 13:20:42 crc kubenswrapper[4970]: I1128 13:20:42.210783 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"171af68c-5bcc-458f-9ec6-28bb994b3d55","Type":"ContainerStarted","Data":"fd83c25434e176152c5bce60e66836214c65467dab9e2bde36423b60a9e306e9"} Nov 28 13:20:42 crc kubenswrapper[4970]: I1128 13:20:42.264725 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 28 13:20:42 crc kubenswrapper[4970]: I1128 13:20:42.264806 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 28 13:20:42 crc kubenswrapper[4970]: I1128 13:20:42.264835 4970 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 28 13:20:42 crc kubenswrapper[4970]: I1128 13:20:42.265555 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 28 13:20:42 crc kubenswrapper[4970]: I1128 13:20:42.270163 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 28 13:20:42 crc kubenswrapper[4970]: I1128 13:20:42.270646 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 28 13:20:42 crc kubenswrapper[4970]: I1128 13:20:42.271108 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 28 13:20:42 crc kubenswrapper[4970]: I1128 13:20:42.316760 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 28 13:20:42 crc kubenswrapper[4970]: I1128 13:20:42.388450 4970 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 28 13:20:42 crc kubenswrapper[4970]: I1128 13:20:42.394102 4970 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 28 13:20:42 crc kubenswrapper[4970]: I1128 13:20:42.401206 4970 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 28 13:20:43 crc kubenswrapper[4970]: W1128 13:20:43.054994 4970 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3b6479f0_333b_4a96_9adf_2099afdc2447.slice/crio-6d7c897eaadf7aa436a42102097ece5cf6242bbd77611269ca63d095d6eda104 WatchSource:0}: Error finding container 6d7c897eaadf7aa436a42102097ece5cf6242bbd77611269ca63d095d6eda104: Status 404 returned error can't find the container with id 6d7c897eaadf7aa436a42102097ece5cf6242bbd77611269ca63d095d6eda104 Nov 28 13:20:43 crc kubenswrapper[4970]: W1128 13:20:43.196823 4970 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9d751cbb_f2e2_430d_9754_c882a5e924a5.slice/crio-b34dc5a60f691663aff1327577b981011de60bb9d88ae207ac695e9ae87b7cd6 WatchSource:0}: Error finding container b34dc5a60f691663aff1327577b981011de60bb9d88ae207ac695e9ae87b7cd6: Status 404 returned error can't find the container with id b34dc5a60f691663aff1327577b981011de60bb9d88ae207ac695e9ae87b7cd6 Nov 28 13:20:43 crc kubenswrapper[4970]: I1128 13:20:43.222813 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"b34dc5a60f691663aff1327577b981011de60bb9d88ae207ac695e9ae87b7cd6"} Nov 28 13:20:43 crc kubenswrapper[4970]: I1128 13:20:43.224107 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"6d7c897eaadf7aa436a42102097ece5cf6242bbd77611269ca63d095d6eda104"} Nov 28 13:20:43 crc kubenswrapper[4970]: I1128 13:20:43.227200 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"488d879e80e2eadb035ce778bdcfce90691972b237407fbda703d2529285415a"} Nov 28 13:20:43 crc kubenswrapper[4970]: I1128 13:20:43.784552 4970 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-p47s7" Nov 28 13:20:44 crc kubenswrapper[4970]: I1128 13:20:44.234775 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"035d0bea73078d28fe1f994d1a3eac16b364c88f1bdc1bb2a07e0af652b70b53"} Nov 28 13:20:44 crc kubenswrapper[4970]: I1128 13:20:44.251514 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"bb7415910f36555cda69f87c985b4804906e6a40f88cec28265956e64a4ed5bd"} Nov 28 13:20:44 crc kubenswrapper[4970]: I1128 13:20:44.251837 4970 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 28 13:20:44 crc kubenswrapper[4970]: I1128 13:20:44.253970 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" 
event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"834f0a9fdb2fe310988377673b9236f7f7358e4bbaf04d43c9660cbfd5640963"} Nov 28 13:20:44 crc kubenswrapper[4970]: I1128 13:20:44.257774 4970 generic.go:334] "Generic (PLEG): container finished" podID="171af68c-5bcc-458f-9ec6-28bb994b3d55" containerID="672b4058a6cc00c7f9301476a877fb04a868a02ed9e179cfb6c19daba827387a" exitCode=0 Nov 28 13:20:44 crc kubenswrapper[4970]: I1128 13:20:44.257806 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"171af68c-5bcc-458f-9ec6-28bb994b3d55","Type":"ContainerDied","Data":"672b4058a6cc00c7f9301476a877fb04a868a02ed9e179cfb6c19daba827387a"} Nov 28 13:20:46 crc kubenswrapper[4970]: I1128 13:20:46.565925 4970 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-6c6s9" Nov 28 13:20:48 crc kubenswrapper[4970]: I1128 13:20:48.520342 4970 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-7954f5f757-9chbz" Nov 28 13:20:48 crc kubenswrapper[4970]: E1128 13:20:48.828276 4970 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="2fc28451c84d1813a773345a163950b323fe8d1c9f7deb705e1af63c1f469e25" cmd=["/bin/bash","-c","test -f /ready/ready"] Nov 28 13:20:48 crc kubenswrapper[4970]: E1128 13:20:48.830406 4970 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="2fc28451c84d1813a773345a163950b323fe8d1c9f7deb705e1af63c1f469e25" cmd=["/bin/bash","-c","test -f /ready/ready"] Nov 28 13:20:48 crc kubenswrapper[4970]: I1128 13:20:48.831653 4970 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-f9d7485db-kwnx5" Nov 28 13:20:48 crc kubenswrapper[4970]: E1128 13:20:48.831700 4970 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="2fc28451c84d1813a773345a163950b323fe8d1c9f7deb705e1af63c1f469e25" cmd=["/bin/bash","-c","test -f /ready/ready"] Nov 28 13:20:48 crc kubenswrapper[4970]: E1128 13:20:48.832358 4970 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openshift-multus/cni-sysctl-allowlist-ds-8btsr" podUID="a5b9dda0-da70-4e7c-850b-de8b7744a15c" containerName="kube-multus-additional-cni-plugins" Nov 28 13:20:48 crc kubenswrapper[4970]: I1128 13:20:48.913414 4970 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-f9d7485db-kwnx5" Nov 28 13:20:52 crc kubenswrapper[4970]: I1128 13:20:52.301954 4970 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Nov 28 13:20:52 crc kubenswrapper[4970]: I1128 13:20:52.313618 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"171af68c-5bcc-458f-9ec6-28bb994b3d55","Type":"ContainerDied","Data":"fd83c25434e176152c5bce60e66836214c65467dab9e2bde36423b60a9e306e9"} Nov 28 13:20:52 crc kubenswrapper[4970]: I1128 13:20:52.313652 4970 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fd83c25434e176152c5bce60e66836214c65467dab9e2bde36423b60a9e306e9" Nov 28 13:20:52 crc kubenswrapper[4970]: I1128 13:20:52.313647 4970 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Nov 28 13:20:52 crc kubenswrapper[4970]: I1128 13:20:52.433409 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/171af68c-5bcc-458f-9ec6-28bb994b3d55-kubelet-dir\") pod \"171af68c-5bcc-458f-9ec6-28bb994b3d55\" (UID: \"171af68c-5bcc-458f-9ec6-28bb994b3d55\") " Nov 28 13:20:52 crc kubenswrapper[4970]: I1128 13:20:52.433459 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/171af68c-5bcc-458f-9ec6-28bb994b3d55-kube-api-access\") pod \"171af68c-5bcc-458f-9ec6-28bb994b3d55\" (UID: \"171af68c-5bcc-458f-9ec6-28bb994b3d55\") " Nov 28 13:20:52 crc kubenswrapper[4970]: I1128 13:20:52.433491 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/171af68c-5bcc-458f-9ec6-28bb994b3d55-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "171af68c-5bcc-458f-9ec6-28bb994b3d55" (UID: "171af68c-5bcc-458f-9ec6-28bb994b3d55"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 28 13:20:52 crc kubenswrapper[4970]: I1128 13:20:52.433711 4970 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/171af68c-5bcc-458f-9ec6-28bb994b3d55-kubelet-dir\") on node \"crc\" DevicePath \"\"" Nov 28 13:20:52 crc kubenswrapper[4970]: I1128 13:20:52.445426 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/171af68c-5bcc-458f-9ec6-28bb994b3d55-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "171af68c-5bcc-458f-9ec6-28bb994b3d55" (UID: "171af68c-5bcc-458f-9ec6-28bb994b3d55"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 28 13:20:52 crc kubenswrapper[4970]: I1128 13:20:52.535432 4970 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/171af68c-5bcc-458f-9ec6-28bb994b3d55-kube-api-access\") on node \"crc\" DevicePath \"\"" Nov 28 13:20:56 crc kubenswrapper[4970]: I1128 13:20:56.796343 4970 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-697d97f7c8-tjj4p" Nov 28 13:20:58 crc kubenswrapper[4970]: E1128 13:20:58.829409 4970 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="2fc28451c84d1813a773345a163950b323fe8d1c9f7deb705e1af63c1f469e25" cmd=["/bin/bash","-c","test -f /ready/ready"] Nov 28 13:20:58 crc kubenswrapper[4970]: E1128 13:20:58.831601 4970 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="2fc28451c84d1813a773345a163950b323fe8d1c9f7deb705e1af63c1f469e25" cmd=["/bin/bash","-c","test -f /ready/ready"] Nov 28 13:20:58 crc kubenswrapper[4970]: E1128 13:20:58.833520 4970 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="2fc28451c84d1813a773345a163950b323fe8d1c9f7deb705e1af63c1f469e25" cmd=["/bin/bash","-c","test -f /ready/ready"] Nov 28 13:20:58 crc kubenswrapper[4970]: E1128 13:20:58.833574 4970 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openshift-multus/cni-sysctl-allowlist-ds-8btsr" podUID="a5b9dda0-da70-4e7c-850b-de8b7744a15c" containerName="kube-multus-additional-cni-plugins" Nov 28 13:21:05 crc kubenswrapper[4970]: I1128 13:21:05.410299 4970 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc"] Nov 28 13:21:06 crc kubenswrapper[4970]: I1128 13:21:06.396059 4970 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_cni-sysctl-allowlist-ds-8btsr_a5b9dda0-da70-4e7c-850b-de8b7744a15c/kube-multus-additional-cni-plugins/0.log" Nov 28 13:21:06 crc kubenswrapper[4970]: I1128 13:21:06.396101 4970 generic.go:334] "Generic (PLEG): container finished" podID="a5b9dda0-da70-4e7c-850b-de8b7744a15c" containerID="2fc28451c84d1813a773345a163950b323fe8d1c9f7deb705e1af63c1f469e25" exitCode=137 Nov 28 13:21:06 crc kubenswrapper[4970]: I1128 13:21:06.396151 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/cni-sysctl-allowlist-ds-8btsr" event={"ID":"a5b9dda0-da70-4e7c-850b-de8b7744a15c","Type":"ContainerDied","Data":"2fc28451c84d1813a773345a163950b323fe8d1c9f7deb705e1af63c1f469e25"} Nov 28 13:21:08 crc kubenswrapper[4970]: E1128 13:21:08.826061 4970 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 2fc28451c84d1813a773345a163950b323fe8d1c9f7deb705e1af63c1f469e25 is running failed: container process not found" containerID="2fc28451c84d1813a773345a163950b323fe8d1c9f7deb705e1af63c1f469e25" cmd=["/bin/bash","-c","test -f 
/ready/ready"] Nov 28 13:21:08 crc kubenswrapper[4970]: E1128 13:21:08.826421 4970 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 2fc28451c84d1813a773345a163950b323fe8d1c9f7deb705e1af63c1f469e25 is running failed: container process not found" containerID="2fc28451c84d1813a773345a163950b323fe8d1c9f7deb705e1af63c1f469e25" cmd=["/bin/bash","-c","test -f /ready/ready"] Nov 28 13:21:08 crc kubenswrapper[4970]: E1128 13:21:08.826747 4970 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 2fc28451c84d1813a773345a163950b323fe8d1c9f7deb705e1af63c1f469e25 is running failed: container process not found" containerID="2fc28451c84d1813a773345a163950b323fe8d1c9f7deb705e1af63c1f469e25" cmd=["/bin/bash","-c","test -f /ready/ready"] Nov 28 13:21:08 crc kubenswrapper[4970]: E1128 13:21:08.826837 4970 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 2fc28451c84d1813a773345a163950b323fe8d1c9f7deb705e1af63c1f469e25 is running failed: container process not found" probeType="Readiness" pod="openshift-multus/cni-sysctl-allowlist-ds-8btsr" podUID="a5b9dda0-da70-4e7c-850b-de8b7744a15c" containerName="kube-multus-additional-cni-plugins" Nov 28 13:21:09 crc kubenswrapper[4970]: I1128 13:21:09.064118 4970 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-t7pk6" Nov 28 13:21:09 crc kubenswrapper[4970]: I1128 13:21:09.086558 4970 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" podStartSLOduration=4.086530454 podStartE2EDuration="4.086530454s" podCreationTimestamp="2025-11-28 13:21:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-28 13:21:09.083683389 +0000 UTC m=+79.936565219" watchObservedRunningTime="2025-11-28 13:21:09.086530454 +0000 UTC m=+79.939412284" Nov 28 13:21:15 crc kubenswrapper[4970]: I1128 13:21:15.659557 4970 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Nov 28 13:21:15 crc kubenswrapper[4970]: E1128 13:21:15.660596 4970 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="171af68c-5bcc-458f-9ec6-28bb994b3d55" containerName="pruner" Nov 28 13:21:15 crc kubenswrapper[4970]: I1128 13:21:15.660628 4970 state_mem.go:107] "Deleted CPUSet assignment" podUID="171af68c-5bcc-458f-9ec6-28bb994b3d55" containerName="pruner" Nov 28 13:21:15 crc kubenswrapper[4970]: I1128 13:21:15.661016 4970 memory_manager.go:354] "RemoveStaleState removing state" podUID="171af68c-5bcc-458f-9ec6-28bb994b3d55" containerName="pruner" Nov 28 13:21:15 crc kubenswrapper[4970]: I1128 13:21:15.663361 4970 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Nov 28 13:21:15 crc kubenswrapper[4970]: I1128 13:21:15.667347 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ff17b14e-1701-403e-ae66-0a20d0fc3792-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"ff17b14e-1701-403e-ae66-0a20d0fc3792\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Nov 28 13:21:15 crc kubenswrapper[4970]: I1128 13:21:15.667775 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/ff17b14e-1701-403e-ae66-0a20d0fc3792-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"ff17b14e-1701-403e-ae66-0a20d0fc3792\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Nov 28 13:21:15 crc kubenswrapper[4970]: I1128 13:21:15.668286 4970 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Nov 28 13:21:15 crc kubenswrapper[4970]: I1128 13:21:15.669272 4970 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Nov 28 13:21:15 crc kubenswrapper[4970]: I1128 13:21:15.674193 4970 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Nov 28 13:21:15 crc kubenswrapper[4970]: I1128 13:21:15.769395 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/ff17b14e-1701-403e-ae66-0a20d0fc3792-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"ff17b14e-1701-403e-ae66-0a20d0fc3792\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Nov 28 13:21:15 crc kubenswrapper[4970]: I1128 13:21:15.769511 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ff17b14e-1701-403e-ae66-0a20d0fc3792-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"ff17b14e-1701-403e-ae66-0a20d0fc3792\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Nov 28 13:21:15 crc kubenswrapper[4970]: I1128 13:21:15.769583 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/ff17b14e-1701-403e-ae66-0a20d0fc3792-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"ff17b14e-1701-403e-ae66-0a20d0fc3792\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Nov 28 13:21:15 crc kubenswrapper[4970]: I1128 13:21:15.802884 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ff17b14e-1701-403e-ae66-0a20d0fc3792-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"ff17b14e-1701-403e-ae66-0a20d0fc3792\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Nov 28 13:21:15 crc kubenswrapper[4970]: I1128 13:21:15.998371 4970 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Nov 28 13:21:18 crc kubenswrapper[4970]: E1128 13:21:18.827116 4970 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 2fc28451c84d1813a773345a163950b323fe8d1c9f7deb705e1af63c1f469e25 is running failed: container process not found" containerID="2fc28451c84d1813a773345a163950b323fe8d1c9f7deb705e1af63c1f469e25" cmd=["/bin/bash","-c","test -f /ready/ready"] Nov 28 13:21:18 crc kubenswrapper[4970]: E1128 13:21:18.828344 4970 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 2fc28451c84d1813a773345a163950b323fe8d1c9f7deb705e1af63c1f469e25 is running failed: container process not found" containerID="2fc28451c84d1813a773345a163950b323fe8d1c9f7deb705e1af63c1f469e25" cmd=["/bin/bash","-c","test -f /ready/ready"] Nov 28 13:21:18 crc kubenswrapper[4970]: E1128 13:21:18.828996 4970 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 2fc28451c84d1813a773345a163950b323fe8d1c9f7deb705e1af63c1f469e25 is running failed: container process not found" containerID="2fc28451c84d1813a773345a163950b323fe8d1c9f7deb705e1af63c1f469e25" cmd=["/bin/bash","-c","test -f /ready/ready"] Nov 28 13:21:18 crc kubenswrapper[4970]: E1128 13:21:18.829078 4970 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 2fc28451c84d1813a773345a163950b323fe8d1c9f7deb705e1af63c1f469e25 is running failed: container process not found" probeType="Readiness" pod="openshift-multus/cni-sysctl-allowlist-ds-8btsr" podUID="a5b9dda0-da70-4e7c-850b-de8b7744a15c" containerName="kube-multus-additional-cni-plugins" Nov 28 13:21:20 crc kubenswrapper[4970]: I1128 13:21:20.259841 4970 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Nov 28 13:21:20 crc kubenswrapper[4970]: I1128 13:21:20.262438 4970 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Nov 28 13:21:20 crc kubenswrapper[4970]: I1128 13:21:20.268053 4970 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Nov 28 13:21:20 crc kubenswrapper[4970]: I1128 13:21:20.331647 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/0a609cfd-28dc-440a-8433-6933565864a7-var-lock\") pod \"installer-9-crc\" (UID: \"0a609cfd-28dc-440a-8433-6933565864a7\") " pod="openshift-kube-apiserver/installer-9-crc" Nov 28 13:21:20 crc kubenswrapper[4970]: I1128 13:21:20.331926 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/0a609cfd-28dc-440a-8433-6933565864a7-kubelet-dir\") pod \"installer-9-crc\" (UID: \"0a609cfd-28dc-440a-8433-6933565864a7\") " pod="openshift-kube-apiserver/installer-9-crc" Nov 28 13:21:20 crc kubenswrapper[4970]: I1128 13:21:20.332023 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0a609cfd-28dc-440a-8433-6933565864a7-kube-api-access\") pod \"installer-9-crc\" (UID: \"0a609cfd-28dc-440a-8433-6933565864a7\") " pod="openshift-kube-apiserver/installer-9-crc" Nov 28 13:21:20 crc kubenswrapper[4970]: I1128 13:21:20.433401 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/0a609cfd-28dc-440a-8433-6933565864a7-var-lock\") pod \"installer-9-crc\" (UID: \"0a609cfd-28dc-440a-8433-6933565864a7\") " pod="openshift-kube-apiserver/installer-9-crc" Nov 28 13:21:20 crc kubenswrapper[4970]: I1128 13:21:20.433865 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/0a609cfd-28dc-440a-8433-6933565864a7-kubelet-dir\") pod \"installer-9-crc\" (UID: \"0a609cfd-28dc-440a-8433-6933565864a7\") " pod="openshift-kube-apiserver/installer-9-crc" Nov 28 13:21:20 crc kubenswrapper[4970]: I1128 13:21:20.433995 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/0a609cfd-28dc-440a-8433-6933565864a7-kubelet-dir\") pod \"installer-9-crc\" (UID: \"0a609cfd-28dc-440a-8433-6933565864a7\") " pod="openshift-kube-apiserver/installer-9-crc" Nov 28 13:21:20 crc kubenswrapper[4970]: I1128 13:21:20.433625 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/0a609cfd-28dc-440a-8433-6933565864a7-var-lock\") pod \"installer-9-crc\" (UID: \"0a609cfd-28dc-440a-8433-6933565864a7\") " pod="openshift-kube-apiserver/installer-9-crc" Nov 28 13:21:20 crc kubenswrapper[4970]: I1128 13:21:20.434328 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0a609cfd-28dc-440a-8433-6933565864a7-kube-api-access\") pod \"installer-9-crc\" (UID: \"0a609cfd-28dc-440a-8433-6933565864a7\") " pod="openshift-kube-apiserver/installer-9-crc" Nov 28 13:21:20 crc kubenswrapper[4970]: I1128 13:21:20.467149 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0a609cfd-28dc-440a-8433-6933565864a7-kube-api-access\") pod \"installer-9-crc\" (UID: 
\"0a609cfd-28dc-440a-8433-6933565864a7\") " pod="openshift-kube-apiserver/installer-9-crc" Nov 28 13:21:20 crc kubenswrapper[4970]: I1128 13:21:20.608649 4970 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Nov 28 13:21:23 crc kubenswrapper[4970]: I1128 13:21:23.014698 4970 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 28 13:21:28 crc kubenswrapper[4970]: I1128 13:21:28.406306 4970 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd/etcd-crc"] Nov 28 13:21:28 crc kubenswrapper[4970]: E1128 13:21:28.827108 4970 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 2fc28451c84d1813a773345a163950b323fe8d1c9f7deb705e1af63c1f469e25 is running failed: container process not found" containerID="2fc28451c84d1813a773345a163950b323fe8d1c9f7deb705e1af63c1f469e25" cmd=["/bin/bash","-c","test -f /ready/ready"] Nov 28 13:21:28 crc kubenswrapper[4970]: E1128 13:21:28.827675 4970 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 2fc28451c84d1813a773345a163950b323fe8d1c9f7deb705e1af63c1f469e25 is running failed: container process not found" containerID="2fc28451c84d1813a773345a163950b323fe8d1c9f7deb705e1af63c1f469e25" cmd=["/bin/bash","-c","test -f /ready/ready"] Nov 28 13:21:28 crc kubenswrapper[4970]: E1128 13:21:28.828139 4970 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 2fc28451c84d1813a773345a163950b323fe8d1c9f7deb705e1af63c1f469e25 is running failed: container process not found" containerID="2fc28451c84d1813a773345a163950b323fe8d1c9f7deb705e1af63c1f469e25" cmd=["/bin/bash","-c","test -f /ready/ready"] Nov 28 13:21:28 crc kubenswrapper[4970]: E1128 13:21:28.828295 4970 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 2fc28451c84d1813a773345a163950b323fe8d1c9f7deb705e1af63c1f469e25 is running failed: container process not found" probeType="Readiness" pod="openshift-multus/cni-sysctl-allowlist-ds-8btsr" podUID="a5b9dda0-da70-4e7c-850b-de8b7744a15c" containerName="kube-multus-additional-cni-plugins" Nov 28 13:21:29 crc kubenswrapper[4970]: E1128 13:21:29.079098 4970 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Nov 28 13:21:29 crc kubenswrapper[4970]: E1128 13:21:29.079355 4970 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-644lr,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-l4xkm_openshift-marketplace(507a4c7b-eff8-4695-a222-3a40b0483eb8): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Nov 28 13:21:29 crc kubenswrapper[4970]: E1128 13:21:29.080728 4970 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-l4xkm" podUID="507a4c7b-eff8-4695-a222-3a40b0483eb8" Nov 28 13:21:29 crc kubenswrapper[4970]: I1128 13:21:29.433933 4970 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/etcd-crc" podStartSLOduration=1.433917324 podStartE2EDuration="1.433917324s" podCreationTimestamp="2025-11-28 13:21:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-28 13:21:29.432206563 +0000 UTC m=+100.285088413" watchObservedRunningTime="2025-11-28 13:21:29.433917324 +0000 UTC m=+100.286799124" Nov 28 13:21:38 crc kubenswrapper[4970]: E1128 13:21:38.826877 4970 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 2fc28451c84d1813a773345a163950b323fe8d1c9f7deb705e1af63c1f469e25 is running failed: container process not found" containerID="2fc28451c84d1813a773345a163950b323fe8d1c9f7deb705e1af63c1f469e25" cmd=["/bin/bash","-c","test -f /ready/ready"] Nov 28 13:21:38 crc kubenswrapper[4970]: E1128 13:21:38.827882 4970 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 2fc28451c84d1813a773345a163950b323fe8d1c9f7deb705e1af63c1f469e25 is running failed: container process not found" containerID="2fc28451c84d1813a773345a163950b323fe8d1c9f7deb705e1af63c1f469e25" cmd=["/bin/bash","-c","test -f /ready/ready"] Nov 28 13:21:38 crc kubenswrapper[4970]: E1128 13:21:38.828374 4970 log.go:32] "ExecSync cmd from runtime service 
failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 2fc28451c84d1813a773345a163950b323fe8d1c9f7deb705e1af63c1f469e25 is running failed: container process not found" containerID="2fc28451c84d1813a773345a163950b323fe8d1c9f7deb705e1af63c1f469e25" cmd=["/bin/bash","-c","test -f /ready/ready"] Nov 28 13:21:38 crc kubenswrapper[4970]: E1128 13:21:38.828417 4970 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 2fc28451c84d1813a773345a163950b323fe8d1c9f7deb705e1af63c1f469e25 is running failed: container process not found" probeType="Readiness" pod="openshift-multus/cni-sysctl-allowlist-ds-8btsr" podUID="a5b9dda0-da70-4e7c-850b-de8b7744a15c" containerName="kube-multus-additional-cni-plugins" Nov 28 13:21:48 crc kubenswrapper[4970]: E1128 13:21:48.829530 4970 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 2fc28451c84d1813a773345a163950b323fe8d1c9f7deb705e1af63c1f469e25 is running failed: container process not found" containerID="2fc28451c84d1813a773345a163950b323fe8d1c9f7deb705e1af63c1f469e25" cmd=["/bin/bash","-c","test -f /ready/ready"] Nov 28 13:21:48 crc kubenswrapper[4970]: E1128 13:21:48.834705 4970 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 2fc28451c84d1813a773345a163950b323fe8d1c9f7deb705e1af63c1f469e25 is running failed: container process not found" containerID="2fc28451c84d1813a773345a163950b323fe8d1c9f7deb705e1af63c1f469e25" cmd=["/bin/bash","-c","test -f /ready/ready"] Nov 28 13:21:48 crc kubenswrapper[4970]: E1128 13:21:48.835423 4970 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 2fc28451c84d1813a773345a163950b323fe8d1c9f7deb705e1af63c1f469e25 is running failed: container process not found" containerID="2fc28451c84d1813a773345a163950b323fe8d1c9f7deb705e1af63c1f469e25" cmd=["/bin/bash","-c","test -f /ready/ready"] Nov 28 13:21:48 crc kubenswrapper[4970]: E1128 13:21:48.835516 4970 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 2fc28451c84d1813a773345a163950b323fe8d1c9f7deb705e1af63c1f469e25 is running failed: container process not found" probeType="Readiness" pod="openshift-multus/cni-sysctl-allowlist-ds-8btsr" podUID="a5b9dda0-da70-4e7c-850b-de8b7744a15c" containerName="kube-multus-additional-cni-plugins" Nov 28 13:21:58 crc kubenswrapper[4970]: E1128 13:21:58.827292 4970 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 2fc28451c84d1813a773345a163950b323fe8d1c9f7deb705e1af63c1f469e25 is running failed: container process not found" containerID="2fc28451c84d1813a773345a163950b323fe8d1c9f7deb705e1af63c1f469e25" cmd=["/bin/bash","-c","test -f /ready/ready"] Nov 28 13:21:58 crc kubenswrapper[4970]: E1128 13:21:58.828675 4970 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 2fc28451c84d1813a773345a163950b323fe8d1c9f7deb705e1af63c1f469e25 is running failed: container process not found" containerID="2fc28451c84d1813a773345a163950b323fe8d1c9f7deb705e1af63c1f469e25" 
cmd=["/bin/bash","-c","test -f /ready/ready"] Nov 28 13:21:58 crc kubenswrapper[4970]: E1128 13:21:58.829199 4970 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 2fc28451c84d1813a773345a163950b323fe8d1c9f7deb705e1af63c1f469e25 is running failed: container process not found" containerID="2fc28451c84d1813a773345a163950b323fe8d1c9f7deb705e1af63c1f469e25" cmd=["/bin/bash","-c","test -f /ready/ready"] Nov 28 13:21:58 crc kubenswrapper[4970]: E1128 13:21:58.829293 4970 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 2fc28451c84d1813a773345a163950b323fe8d1c9f7deb705e1af63c1f469e25 is running failed: container process not found" probeType="Readiness" pod="openshift-multus/cni-sysctl-allowlist-ds-8btsr" podUID="a5b9dda0-da70-4e7c-850b-de8b7744a15c" containerName="kube-multus-additional-cni-plugins" Nov 28 13:22:05 crc kubenswrapper[4970]: E1128 13:22:05.692361 4970 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Nov 28 13:22:05 crc kubenswrapper[4970]: E1128 13:22:05.693194 4970 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-6jht9,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-mwg2n_openshift-marketplace(fdf78924-9472-414e-baf6-822e511c464c): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Nov 28 13:22:05 crc kubenswrapper[4970]: E1128 13:22:05.694568 4970 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" 
pod="openshift-marketplace/community-operators-mwg2n" podUID="fdf78924-9472-414e-baf6-822e511c464c" Nov 28 13:22:07 crc kubenswrapper[4970]: E1128 13:22:07.469048 4970 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-mwg2n" podUID="fdf78924-9472-414e-baf6-822e511c464c" Nov 28 13:22:07 crc kubenswrapper[4970]: E1128 13:22:07.544333 4970 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Nov 28 13:22:07 crc kubenswrapper[4970]: E1128 13:22:07.544592 4970 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-5h75m,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-8hjm2_openshift-marketplace(ac87c2e3-5a6b-4998-8db7-165e571f6f52): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Nov 28 13:22:07 crc kubenswrapper[4970]: E1128 13:22:07.545966 4970 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-8hjm2" podUID="ac87c2e3-5a6b-4998-8db7-165e571f6f52" Nov 28 13:22:08 crc kubenswrapper[4970]: E1128 13:22:08.827156 4970 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 2fc28451c84d1813a773345a163950b323fe8d1c9f7deb705e1af63c1f469e25 is running failed: container process not found" containerID="2fc28451c84d1813a773345a163950b323fe8d1c9f7deb705e1af63c1f469e25" cmd=["/bin/bash","-c","test 
-f /ready/ready"] Nov 28 13:22:08 crc kubenswrapper[4970]: E1128 13:22:08.828362 4970 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 2fc28451c84d1813a773345a163950b323fe8d1c9f7deb705e1af63c1f469e25 is running failed: container process not found" containerID="2fc28451c84d1813a773345a163950b323fe8d1c9f7deb705e1af63c1f469e25" cmd=["/bin/bash","-c","test -f /ready/ready"] Nov 28 13:22:08 crc kubenswrapper[4970]: E1128 13:22:08.828824 4970 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 2fc28451c84d1813a773345a163950b323fe8d1c9f7deb705e1af63c1f469e25 is running failed: container process not found" containerID="2fc28451c84d1813a773345a163950b323fe8d1c9f7deb705e1af63c1f469e25" cmd=["/bin/bash","-c","test -f /ready/ready"] Nov 28 13:22:08 crc kubenswrapper[4970]: E1128 13:22:08.828865 4970 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 2fc28451c84d1813a773345a163950b323fe8d1c9f7deb705e1af63c1f469e25 is running failed: container process not found" probeType="Readiness" pod="openshift-multus/cni-sysctl-allowlist-ds-8btsr" podUID="a5b9dda0-da70-4e7c-850b-de8b7744a15c" containerName="kube-multus-additional-cni-plugins" Nov 28 13:22:08 crc kubenswrapper[4970]: E1128 13:22:08.885191 4970 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-8hjm2" podUID="ac87c2e3-5a6b-4998-8db7-165e571f6f52" Nov 28 13:22:08 crc kubenswrapper[4970]: E1128 13:22:08.952399 4970 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Nov 28 13:22:08 crc kubenswrapper[4970]: E1128 13:22:08.952532 4970 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-vr2dl,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-5jvzw_openshift-marketplace(ea4c7183-d326-44cd-8e40-649e3dad901e): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Nov 28 13:22:08 crc kubenswrapper[4970]: E1128 13:22:08.953964 4970 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-5jvzw" podUID="ea4c7183-d326-44cd-8e40-649e3dad901e" Nov 28 13:22:08 crc kubenswrapper[4970]: E1128 13:22:08.960668 4970 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Nov 28 13:22:08 crc kubenswrapper[4970]: E1128 13:22:08.960791 4970 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-pr975,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-nzvwp_openshift-marketplace(4cae710a-4284-4d76-b507-d7aa55adba72): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Nov 28 13:22:08 crc kubenswrapper[4970]: E1128 13:22:08.962847 4970 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-nzvwp" podUID="4cae710a-4284-4d76-b507-d7aa55adba72" Nov 28 13:22:08 crc kubenswrapper[4970]: E1128 13:22:08.964431 4970 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Nov 28 13:22:08 crc kubenswrapper[4970]: E1128 13:22:08.964528 4970 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-jnsfh,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-jcv9n_openshift-marketplace(619af67d-331c-4b38-b536-269ba823fd75): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Nov 28 13:22:08 crc kubenswrapper[4970]: E1128 13:22:08.965641 4970 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-jcv9n" podUID="619af67d-331c-4b38-b536-269ba823fd75" Nov 28 13:22:11 crc kubenswrapper[4970]: E1128 13:22:11.500105 4970 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-nzvwp" podUID="4cae710a-4284-4d76-b507-d7aa55adba72" Nov 28 13:22:11 crc kubenswrapper[4970]: E1128 13:22:11.500182 4970 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-5jvzw" podUID="ea4c7183-d326-44cd-8e40-649e3dad901e" Nov 28 13:22:11 crc kubenswrapper[4970]: E1128 13:22:11.500547 4970 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-jcv9n" podUID="619af67d-331c-4b38-b536-269ba823fd75" Nov 28 13:22:11 crc kubenswrapper[4970]: E1128 13:22:11.539171 4970 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Nov 28 13:22:11 crc kubenswrapper[4970]: E1128 13:22:11.539397 4970 kuberuntime_manager.go:1274] "Unhandled 
Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-fjh5c,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-operators-wwr74_openshift-marketplace(a92f4929-dcab-4362-a5f2-c648f274bf04): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Nov 28 13:22:11 crc kubenswrapper[4970]: E1128 13:22:11.540834 4970 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-wwr74" podUID="a92f4929-dcab-4362-a5f2-c648f274bf04" Nov 28 13:22:11 crc kubenswrapper[4970]: E1128 13:22:11.565381 4970 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Nov 28 13:22:11 crc kubenswrapper[4970]: E1128 13:22:11.565554 4970 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-tdkxs,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-operators-fp5dk_openshift-marketplace(7d9651e6-3bf3-4aad-b2ef-8fb6c42eea44): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Nov 28 13:22:11 crc kubenswrapper[4970]: E1128 13:22:11.566881 4970 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-fp5dk" podUID="7d9651e6-3bf3-4aad-b2ef-8fb6c42eea44" Nov 28 13:22:11 crc kubenswrapper[4970]: I1128 13:22:11.601828 4970 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_cni-sysctl-allowlist-ds-8btsr_a5b9dda0-da70-4e7c-850b-de8b7744a15c/kube-multus-additional-cni-plugins/0.log" Nov 28 13:22:11 crc kubenswrapper[4970]: I1128 13:22:11.602150 4970 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/cni-sysctl-allowlist-ds-8btsr" Nov 28 13:22:11 crc kubenswrapper[4970]: I1128 13:22:11.682803 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/a5b9dda0-da70-4e7c-850b-de8b7744a15c-cni-sysctl-allowlist\") pod \"a5b9dda0-da70-4e7c-850b-de8b7744a15c\" (UID: \"a5b9dda0-da70-4e7c-850b-de8b7744a15c\") " Nov 28 13:22:11 crc kubenswrapper[4970]: I1128 13:22:11.682865 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/a5b9dda0-da70-4e7c-850b-de8b7744a15c-tuning-conf-dir\") pod \"a5b9dda0-da70-4e7c-850b-de8b7744a15c\" (UID: \"a5b9dda0-da70-4e7c-850b-de8b7744a15c\") " Nov 28 13:22:11 crc kubenswrapper[4970]: I1128 13:22:11.682919 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ready\" (UniqueName: \"kubernetes.io/empty-dir/a5b9dda0-da70-4e7c-850b-de8b7744a15c-ready\") pod \"a5b9dda0-da70-4e7c-850b-de8b7744a15c\" (UID: \"a5b9dda0-da70-4e7c-850b-de8b7744a15c\") " Nov 28 13:22:11 crc kubenswrapper[4970]: I1128 13:22:11.682969 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gx24w\" (UniqueName: \"kubernetes.io/projected/a5b9dda0-da70-4e7c-850b-de8b7744a15c-kube-api-access-gx24w\") pod \"a5b9dda0-da70-4e7c-850b-de8b7744a15c\" (UID: \"a5b9dda0-da70-4e7c-850b-de8b7744a15c\") " Nov 28 13:22:11 crc kubenswrapper[4970]: I1128 13:22:11.683323 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a5b9dda0-da70-4e7c-850b-de8b7744a15c-tuning-conf-dir" (OuterVolumeSpecName: "tuning-conf-dir") pod "a5b9dda0-da70-4e7c-850b-de8b7744a15c" (UID: "a5b9dda0-da70-4e7c-850b-de8b7744a15c"). InnerVolumeSpecName "tuning-conf-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 28 13:22:11 crc kubenswrapper[4970]: I1128 13:22:11.683761 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a5b9dda0-da70-4e7c-850b-de8b7744a15c-ready" (OuterVolumeSpecName: "ready") pod "a5b9dda0-da70-4e7c-850b-de8b7744a15c" (UID: "a5b9dda0-da70-4e7c-850b-de8b7744a15c"). InnerVolumeSpecName "ready". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 28 13:22:11 crc kubenswrapper[4970]: I1128 13:22:11.683976 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a5b9dda0-da70-4e7c-850b-de8b7744a15c-cni-sysctl-allowlist" (OuterVolumeSpecName: "cni-sysctl-allowlist") pod "a5b9dda0-da70-4e7c-850b-de8b7744a15c" (UID: "a5b9dda0-da70-4e7c-850b-de8b7744a15c"). InnerVolumeSpecName "cni-sysctl-allowlist". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 28 13:22:11 crc kubenswrapper[4970]: I1128 13:22:11.690672 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a5b9dda0-da70-4e7c-850b-de8b7744a15c-kube-api-access-gx24w" (OuterVolumeSpecName: "kube-api-access-gx24w") pod "a5b9dda0-da70-4e7c-850b-de8b7744a15c" (UID: "a5b9dda0-da70-4e7c-850b-de8b7744a15c"). InnerVolumeSpecName "kube-api-access-gx24w". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 28 13:22:11 crc kubenswrapper[4970]: I1128 13:22:11.760312 4970 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Nov 28 13:22:11 crc kubenswrapper[4970]: W1128 13:22:11.782371 4970 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-podff17b14e_1701_403e_ae66_0a20d0fc3792.slice/crio-8b67e6542484220ba8a9a6cf8afcd8f63205fa8c8c473a2e7ece2af2cdb1830e WatchSource:0}: Error finding container 8b67e6542484220ba8a9a6cf8afcd8f63205fa8c8c473a2e7ece2af2cdb1830e: Status 404 returned error can't find the container with id 8b67e6542484220ba8a9a6cf8afcd8f63205fa8c8c473a2e7ece2af2cdb1830e Nov 28 13:22:11 crc kubenswrapper[4970]: I1128 13:22:11.784628 4970 reconciler_common.go:293] "Volume detached for volume \"ready\" (UniqueName: \"kubernetes.io/empty-dir/a5b9dda0-da70-4e7c-850b-de8b7744a15c-ready\") on node \"crc\" DevicePath \"\"" Nov 28 13:22:11 crc kubenswrapper[4970]: I1128 13:22:11.784695 4970 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gx24w\" (UniqueName: \"kubernetes.io/projected/a5b9dda0-da70-4e7c-850b-de8b7744a15c-kube-api-access-gx24w\") on node \"crc\" DevicePath \"\"" Nov 28 13:22:11 crc kubenswrapper[4970]: I1128 13:22:11.784710 4970 reconciler_common.go:293] "Volume detached for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/a5b9dda0-da70-4e7c-850b-de8b7744a15c-cni-sysctl-allowlist\") on node \"crc\" DevicePath \"\"" Nov 28 13:22:11 crc kubenswrapper[4970]: I1128 13:22:11.784763 4970 reconciler_common.go:293] "Volume detached for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/a5b9dda0-da70-4e7c-850b-de8b7744a15c-tuning-conf-dir\") on node \"crc\" DevicePath \"\"" Nov 28 13:22:11 crc kubenswrapper[4970]: I1128 13:22:11.998094 4970 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Nov 28 13:22:12 crc kubenswrapper[4970]: W1128 13:22:12.098677 4970 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod0a609cfd_28dc_440a_8433_6933565864a7.slice/crio-5ef8efbb1e2a9cc1b675f6ce8927e999b4b350693783ccc54b22fb3bbefeccf0 WatchSource:0}: Error finding container 5ef8efbb1e2a9cc1b675f6ce8927e999b4b350693783ccc54b22fb3bbefeccf0: Status 404 returned error can't find the container with id 5ef8efbb1e2a9cc1b675f6ce8927e999b4b350693783ccc54b22fb3bbefeccf0 Nov 28 13:22:12 crc kubenswrapper[4970]: I1128 13:22:12.325734 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"0a609cfd-28dc-440a-8433-6933565864a7","Type":"ContainerStarted","Data":"5ef8efbb1e2a9cc1b675f6ce8927e999b4b350693783ccc54b22fb3bbefeccf0"} Nov 28 13:22:12 crc kubenswrapper[4970]: I1128 13:22:12.329179 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"ff17b14e-1701-403e-ae66-0a20d0fc3792","Type":"ContainerStarted","Data":"8b67e6542484220ba8a9a6cf8afcd8f63205fa8c8c473a2e7ece2af2cdb1830e"} Nov 28 13:22:12 crc kubenswrapper[4970]: I1128 13:22:12.332025 4970 generic.go:334] "Generic (PLEG): container finished" podID="507a4c7b-eff8-4695-a222-3a40b0483eb8" containerID="768eecf438d47e590ee9a995a9acb52f4e1709f5bfc798dbc2349fd8ab5ba11c" exitCode=0 Nov 28 13:22:12 crc kubenswrapper[4970]: I1128 13:22:12.332131 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-l4xkm" 
event={"ID":"507a4c7b-eff8-4695-a222-3a40b0483eb8","Type":"ContainerDied","Data":"768eecf438d47e590ee9a995a9acb52f4e1709f5bfc798dbc2349fd8ab5ba11c"} Nov 28 13:22:12 crc kubenswrapper[4970]: I1128 13:22:12.334202 4970 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_cni-sysctl-allowlist-ds-8btsr_a5b9dda0-da70-4e7c-850b-de8b7744a15c/kube-multus-additional-cni-plugins/0.log" Nov 28 13:22:12 crc kubenswrapper[4970]: I1128 13:22:12.334299 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/cni-sysctl-allowlist-ds-8btsr" event={"ID":"a5b9dda0-da70-4e7c-850b-de8b7744a15c","Type":"ContainerDied","Data":"b44cdf5cbf30e32199221f203c56e2b950666cc11a4487a6c1008683a372cb2a"} Nov 28 13:22:12 crc kubenswrapper[4970]: I1128 13:22:12.334392 4970 scope.go:117] "RemoveContainer" containerID="2fc28451c84d1813a773345a163950b323fe8d1c9f7deb705e1af63c1f469e25" Nov 28 13:22:12 crc kubenswrapper[4970]: I1128 13:22:12.334478 4970 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-multus/cni-sysctl-allowlist-ds-8btsr" Nov 28 13:22:12 crc kubenswrapper[4970]: E1128 13:22:12.349102 4970 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-wwr74" podUID="a92f4929-dcab-4362-a5f2-c648f274bf04" Nov 28 13:22:12 crc kubenswrapper[4970]: E1128 13:22:12.350022 4970 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-fp5dk" podUID="7d9651e6-3bf3-4aad-b2ef-8fb6c42eea44" Nov 28 13:22:12 crc kubenswrapper[4970]: I1128 13:22:12.412943 4970 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-multus/cni-sysctl-allowlist-ds-8btsr"] Nov 28 13:22:12 crc kubenswrapper[4970]: I1128 13:22:12.418158 4970 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-multus/cni-sysctl-allowlist-ds-8btsr"] Nov 28 13:22:13 crc kubenswrapper[4970]: I1128 13:22:13.383996 4970 generic.go:334] "Generic (PLEG): container finished" podID="ff17b14e-1701-403e-ae66-0a20d0fc3792" containerID="60c420bfbc300a744284b853099b20104e2c54cf2504cd333fae0f4f7de316eb" exitCode=0 Nov 28 13:22:13 crc kubenswrapper[4970]: I1128 13:22:13.388018 4970 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a5b9dda0-da70-4e7c-850b-de8b7744a15c" path="/var/lib/kubelet/pods/a5b9dda0-da70-4e7c-850b-de8b7744a15c/volumes" Nov 28 13:22:13 crc kubenswrapper[4970]: I1128 13:22:13.388640 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"ff17b14e-1701-403e-ae66-0a20d0fc3792","Type":"ContainerDied","Data":"60c420bfbc300a744284b853099b20104e2c54cf2504cd333fae0f4f7de316eb"} Nov 28 13:22:13 crc kubenswrapper[4970]: I1128 13:22:13.388674 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"0a609cfd-28dc-440a-8433-6933565864a7","Type":"ContainerStarted","Data":"0ba9afdf36f07a12061629b547653e9030f8dc938cac0fc11c59d04fc0cc587b"} Nov 28 13:22:13 crc kubenswrapper[4970]: I1128 13:22:13.389112 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-l4xkm" 
event={"ID":"507a4c7b-eff8-4695-a222-3a40b0483eb8","Type":"ContainerStarted","Data":"fe59cc14c4a43d3af87d4897d83c5eca2f2073278a22a1933e67f25c0c08b56f"} Nov 28 13:22:13 crc kubenswrapper[4970]: I1128 13:22:13.428425 4970 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-l4xkm" podStartSLOduration=1.70391111 podStartE2EDuration="1m37.42840639s" podCreationTimestamp="2025-11-28 13:20:36 +0000 UTC" firstStartedPulling="2025-11-28 13:20:37.146249806 +0000 UTC m=+47.999131606" lastFinishedPulling="2025-11-28 13:22:12.870745036 +0000 UTC m=+143.723626886" observedRunningTime="2025-11-28 13:22:13.414122585 +0000 UTC m=+144.267004385" watchObservedRunningTime="2025-11-28 13:22:13.42840639 +0000 UTC m=+144.281288190" Nov 28 13:22:13 crc kubenswrapper[4970]: I1128 13:22:13.460985 4970 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/installer-9-crc" podStartSLOduration=53.460963805 podStartE2EDuration="53.460963805s" podCreationTimestamp="2025-11-28 13:21:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-28 13:22:13.434049936 +0000 UTC m=+144.286931776" watchObservedRunningTime="2025-11-28 13:22:13.460963805 +0000 UTC m=+144.313845605" Nov 28 13:22:14 crc kubenswrapper[4970]: I1128 13:22:14.631236 4970 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Nov 28 13:22:14 crc kubenswrapper[4970]: I1128 13:22:14.721586 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/ff17b14e-1701-403e-ae66-0a20d0fc3792-kubelet-dir\") pod \"ff17b14e-1701-403e-ae66-0a20d0fc3792\" (UID: \"ff17b14e-1701-403e-ae66-0a20d0fc3792\") " Nov 28 13:22:14 crc kubenswrapper[4970]: I1128 13:22:14.721670 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ff17b14e-1701-403e-ae66-0a20d0fc3792-kube-api-access\") pod \"ff17b14e-1701-403e-ae66-0a20d0fc3792\" (UID: \"ff17b14e-1701-403e-ae66-0a20d0fc3792\") " Nov 28 13:22:14 crc kubenswrapper[4970]: I1128 13:22:14.721693 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ff17b14e-1701-403e-ae66-0a20d0fc3792-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "ff17b14e-1701-403e-ae66-0a20d0fc3792" (UID: "ff17b14e-1701-403e-ae66-0a20d0fc3792"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 28 13:22:14 crc kubenswrapper[4970]: I1128 13:22:14.721859 4970 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/ff17b14e-1701-403e-ae66-0a20d0fc3792-kubelet-dir\") on node \"crc\" DevicePath \"\"" Nov 28 13:22:14 crc kubenswrapper[4970]: I1128 13:22:14.727341 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ff17b14e-1701-403e-ae66-0a20d0fc3792-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "ff17b14e-1701-403e-ae66-0a20d0fc3792" (UID: "ff17b14e-1701-403e-ae66-0a20d0fc3792"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 28 13:22:14 crc kubenswrapper[4970]: I1128 13:22:14.822844 4970 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ff17b14e-1701-403e-ae66-0a20d0fc3792-kube-api-access\") on node \"crc\" DevicePath \"\"" Nov 28 13:22:15 crc kubenswrapper[4970]: I1128 13:22:15.399892 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"ff17b14e-1701-403e-ae66-0a20d0fc3792","Type":"ContainerDied","Data":"8b67e6542484220ba8a9a6cf8afcd8f63205fa8c8c473a2e7ece2af2cdb1830e"} Nov 28 13:22:15 crc kubenswrapper[4970]: I1128 13:22:15.399929 4970 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8b67e6542484220ba8a9a6cf8afcd8f63205fa8c8c473a2e7ece2af2cdb1830e" Nov 28 13:22:15 crc kubenswrapper[4970]: I1128 13:22:15.399960 4970 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Nov 28 13:22:16 crc kubenswrapper[4970]: I1128 13:22:16.444114 4970 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-l4xkm" Nov 28 13:22:16 crc kubenswrapper[4970]: I1128 13:22:16.444450 4970 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-l4xkm" Nov 28 13:22:16 crc kubenswrapper[4970]: I1128 13:22:16.527848 4970 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-l4xkm" Nov 28 13:22:17 crc kubenswrapper[4970]: I1128 13:22:17.802728 4970 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-cdzrp"] Nov 28 13:22:21 crc kubenswrapper[4970]: I1128 13:22:21.333389 4970 patch_prober.go:28] interesting pod/machine-config-daemon-tjrng container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 28 13:22:21 crc kubenswrapper[4970]: I1128 13:22:21.333986 4970 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-tjrng" podUID="70bedd43-c527-436e-b47b-0b9ec5b10601" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 28 13:22:22 crc kubenswrapper[4970]: I1128 13:22:22.436967 4970 generic.go:334] "Generic (PLEG): container finished" podID="ac87c2e3-5a6b-4998-8db7-165e571f6f52" containerID="488e868e28f939a9fb5e1536ee9d110a27d69b0d667ea9fe850be4cee9b3139c" exitCode=0 Nov 28 13:22:22 crc kubenswrapper[4970]: I1128 13:22:22.437054 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8hjm2" event={"ID":"ac87c2e3-5a6b-4998-8db7-165e571f6f52","Type":"ContainerDied","Data":"488e868e28f939a9fb5e1536ee9d110a27d69b0d667ea9fe850be4cee9b3139c"} Nov 28 13:22:23 crc kubenswrapper[4970]: I1128 13:22:23.448847 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mwg2n" event={"ID":"fdf78924-9472-414e-baf6-822e511c464c","Type":"ContainerStarted","Data":"0eed0f215eb4f624a7447160993f2dc93fdcfce64131091866e58db660c3d413"} Nov 28 13:22:24 crc kubenswrapper[4970]: I1128 13:22:24.455473 4970 generic.go:334] "Generic 
(PLEG): container finished" podID="fdf78924-9472-414e-baf6-822e511c464c" containerID="0eed0f215eb4f624a7447160993f2dc93fdcfce64131091866e58db660c3d413" exitCode=0 Nov 28 13:22:24 crc kubenswrapper[4970]: I1128 13:22:24.455548 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mwg2n" event={"ID":"fdf78924-9472-414e-baf6-822e511c464c","Type":"ContainerDied","Data":"0eed0f215eb4f624a7447160993f2dc93fdcfce64131091866e58db660c3d413"} Nov 28 13:22:24 crc kubenswrapper[4970]: I1128 13:22:24.458124 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8hjm2" event={"ID":"ac87c2e3-5a6b-4998-8db7-165e571f6f52","Type":"ContainerStarted","Data":"61ffead2ec33c64616686a0a1bf5037299cc6617b355521167580fe7f77e448a"} Nov 28 13:22:24 crc kubenswrapper[4970]: I1128 13:22:24.492346 4970 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-8hjm2" podStartSLOduration=3.79044924 podStartE2EDuration="1m47.492326647s" podCreationTimestamp="2025-11-28 13:20:37 +0000 UTC" firstStartedPulling="2025-11-28 13:20:39.174852885 +0000 UTC m=+50.027734685" lastFinishedPulling="2025-11-28 13:22:22.876730302 +0000 UTC m=+153.729612092" observedRunningTime="2025-11-28 13:22:24.488836628 +0000 UTC m=+155.341718428" watchObservedRunningTime="2025-11-28 13:22:24.492326647 +0000 UTC m=+155.345208457" Nov 28 13:22:25 crc kubenswrapper[4970]: I1128 13:22:25.469073 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fp5dk" event={"ID":"7d9651e6-3bf3-4aad-b2ef-8fb6c42eea44","Type":"ContainerStarted","Data":"1ed8a9fa1ce3fdc311f9a2e6537817b63a75f66ee54ad495c9c87a816e94986a"} Nov 28 13:22:26 crc kubenswrapper[4970]: I1128 13:22:26.479661 4970 generic.go:334] "Generic (PLEG): container finished" podID="7d9651e6-3bf3-4aad-b2ef-8fb6c42eea44" containerID="1ed8a9fa1ce3fdc311f9a2e6537817b63a75f66ee54ad495c9c87a816e94986a" exitCode=0 Nov 28 13:22:26 crc kubenswrapper[4970]: I1128 13:22:26.479733 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fp5dk" event={"ID":"7d9651e6-3bf3-4aad-b2ef-8fb6c42eea44","Type":"ContainerDied","Data":"1ed8a9fa1ce3fdc311f9a2e6537817b63a75f66ee54ad495c9c87a816e94986a"} Nov 28 13:22:26 crc kubenswrapper[4970]: I1128 13:22:26.484667 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mwg2n" event={"ID":"fdf78924-9472-414e-baf6-822e511c464c","Type":"ContainerStarted","Data":"56aa5529733972838719518afcebdd586895c9f7e2d8edd529e9035247478a45"} Nov 28 13:22:26 crc kubenswrapper[4970]: I1128 13:22:26.492555 4970 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-l4xkm" Nov 28 13:22:26 crc kubenswrapper[4970]: I1128 13:22:26.516037 4970 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-mwg2n" podStartSLOduration=3.304134968 podStartE2EDuration="1m51.516022397s" podCreationTimestamp="2025-11-28 13:20:35 +0000 UTC" firstStartedPulling="2025-11-28 13:20:37.147659247 +0000 UTC m=+48.000541047" lastFinishedPulling="2025-11-28 13:22:25.359546656 +0000 UTC m=+156.212428476" observedRunningTime="2025-11-28 13:22:26.512027793 +0000 UTC m=+157.364909593" watchObservedRunningTime="2025-11-28 13:22:26.516022397 +0000 UTC m=+157.368904197" Nov 28 13:22:27 crc kubenswrapper[4970]: I1128 13:22:27.828720 4970 
kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-8hjm2" Nov 28 13:22:27 crc kubenswrapper[4970]: I1128 13:22:27.828787 4970 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-8hjm2" Nov 28 13:22:27 crc kubenswrapper[4970]: I1128 13:22:27.885635 4970 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-8hjm2" Nov 28 13:22:28 crc kubenswrapper[4970]: I1128 13:22:28.548984 4970 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-8hjm2" Nov 28 13:22:29 crc kubenswrapper[4970]: I1128 13:22:29.486130 4970 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-l4xkm"] Nov 28 13:22:29 crc kubenswrapper[4970]: I1128 13:22:29.486391 4970 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-l4xkm" podUID="507a4c7b-eff8-4695-a222-3a40b0483eb8" containerName="registry-server" containerID="cri-o://fe59cc14c4a43d3af87d4897d83c5eca2f2073278a22a1933e67f25c0c08b56f" gracePeriod=2 Nov 28 13:22:35 crc kubenswrapper[4970]: I1128 13:22:35.542199 4970 generic.go:334] "Generic (PLEG): container finished" podID="507a4c7b-eff8-4695-a222-3a40b0483eb8" containerID="fe59cc14c4a43d3af87d4897d83c5eca2f2073278a22a1933e67f25c0c08b56f" exitCode=0 Nov 28 13:22:35 crc kubenswrapper[4970]: I1128 13:22:35.542269 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-l4xkm" event={"ID":"507a4c7b-eff8-4695-a222-3a40b0483eb8","Type":"ContainerDied","Data":"fe59cc14c4a43d3af87d4897d83c5eca2f2073278a22a1933e67f25c0c08b56f"} Nov 28 13:22:36 crc kubenswrapper[4970]: I1128 13:22:36.048751 4970 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-mwg2n" Nov 28 13:22:36 crc kubenswrapper[4970]: I1128 13:22:36.048847 4970 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-mwg2n" Nov 28 13:22:36 crc kubenswrapper[4970]: I1128 13:22:36.118168 4970 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-mwg2n" Nov 28 13:22:36 crc kubenswrapper[4970]: E1128 13:22:36.444481 4970 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of fe59cc14c4a43d3af87d4897d83c5eca2f2073278a22a1933e67f25c0c08b56f is running failed: container process not found" containerID="fe59cc14c4a43d3af87d4897d83c5eca2f2073278a22a1933e67f25c0c08b56f" cmd=["grpc_health_probe","-addr=:50051"] Nov 28 13:22:36 crc kubenswrapper[4970]: E1128 13:22:36.445038 4970 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of fe59cc14c4a43d3af87d4897d83c5eca2f2073278a22a1933e67f25c0c08b56f is running failed: container process not found" containerID="fe59cc14c4a43d3af87d4897d83c5eca2f2073278a22a1933e67f25c0c08b56f" cmd=["grpc_health_probe","-addr=:50051"] Nov 28 13:22:36 crc kubenswrapper[4970]: E1128 13:22:36.445524 4970 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 
fe59cc14c4a43d3af87d4897d83c5eca2f2073278a22a1933e67f25c0c08b56f is running failed: container process not found" containerID="fe59cc14c4a43d3af87d4897d83c5eca2f2073278a22a1933e67f25c0c08b56f" cmd=["grpc_health_probe","-addr=:50051"] Nov 28 13:22:36 crc kubenswrapper[4970]: E1128 13:22:36.445631 4970 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of fe59cc14c4a43d3af87d4897d83c5eca2f2073278a22a1933e67f25c0c08b56f is running failed: container process not found" probeType="Readiness" pod="openshift-marketplace/community-operators-l4xkm" podUID="507a4c7b-eff8-4695-a222-3a40b0483eb8" containerName="registry-server" Nov 28 13:22:36 crc kubenswrapper[4970]: I1128 13:22:36.609727 4970 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-mwg2n" Nov 28 13:22:42 crc kubenswrapper[4970]: I1128 13:22:42.836026 4970 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-authentication/oauth-openshift-558db77b4-cdzrp" podUID="65101460-48b8-4bd6-82b0-4f5bd4254ec5" containerName="oauth-openshift" containerID="cri-o://9c6804f75305b0a57406da7538db40ab4eb23ecbedfba98d8420b87f8bd0bff0" gracePeriod=15 Nov 28 13:22:44 crc kubenswrapper[4970]: I1128 13:22:44.609239 4970 generic.go:334] "Generic (PLEG): container finished" podID="65101460-48b8-4bd6-82b0-4f5bd4254ec5" containerID="9c6804f75305b0a57406da7538db40ab4eb23ecbedfba98d8420b87f8bd0bff0" exitCode=0 Nov 28 13:22:44 crc kubenswrapper[4970]: I1128 13:22:44.609348 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-cdzrp" event={"ID":"65101460-48b8-4bd6-82b0-4f5bd4254ec5","Type":"ContainerDied","Data":"9c6804f75305b0a57406da7538db40ab4eb23ecbedfba98d8420b87f8bd0bff0"} Nov 28 13:22:44 crc kubenswrapper[4970]: I1128 13:22:44.997985 4970 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-l4xkm" Nov 28 13:22:45 crc kubenswrapper[4970]: I1128 13:22:45.046819 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/507a4c7b-eff8-4695-a222-3a40b0483eb8-utilities\") pod \"507a4c7b-eff8-4695-a222-3a40b0483eb8\" (UID: \"507a4c7b-eff8-4695-a222-3a40b0483eb8\") " Nov 28 13:22:45 crc kubenswrapper[4970]: I1128 13:22:45.046919 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-644lr\" (UniqueName: \"kubernetes.io/projected/507a4c7b-eff8-4695-a222-3a40b0483eb8-kube-api-access-644lr\") pod \"507a4c7b-eff8-4695-a222-3a40b0483eb8\" (UID: \"507a4c7b-eff8-4695-a222-3a40b0483eb8\") " Nov 28 13:22:45 crc kubenswrapper[4970]: I1128 13:22:45.047057 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/507a4c7b-eff8-4695-a222-3a40b0483eb8-catalog-content\") pod \"507a4c7b-eff8-4695-a222-3a40b0483eb8\" (UID: \"507a4c7b-eff8-4695-a222-3a40b0483eb8\") " Nov 28 13:22:45 crc kubenswrapper[4970]: I1128 13:22:45.047663 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/507a4c7b-eff8-4695-a222-3a40b0483eb8-utilities" (OuterVolumeSpecName: "utilities") pod "507a4c7b-eff8-4695-a222-3a40b0483eb8" (UID: "507a4c7b-eff8-4695-a222-3a40b0483eb8"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 28 13:22:45 crc kubenswrapper[4970]: I1128 13:22:45.050677 4970 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-cdzrp" Nov 28 13:22:45 crc kubenswrapper[4970]: I1128 13:22:45.052394 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/507a4c7b-eff8-4695-a222-3a40b0483eb8-kube-api-access-644lr" (OuterVolumeSpecName: "kube-api-access-644lr") pod "507a4c7b-eff8-4695-a222-3a40b0483eb8" (UID: "507a4c7b-eff8-4695-a222-3a40b0483eb8"). InnerVolumeSpecName "kube-api-access-644lr". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 28 13:22:45 crc kubenswrapper[4970]: I1128 13:22:45.098125 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/507a4c7b-eff8-4695-a222-3a40b0483eb8-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "507a4c7b-eff8-4695-a222-3a40b0483eb8" (UID: "507a4c7b-eff8-4695-a222-3a40b0483eb8"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 28 13:22:45 crc kubenswrapper[4970]: I1128 13:22:45.148034 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/65101460-48b8-4bd6-82b0-4f5bd4254ec5-v4-0-config-system-trusted-ca-bundle\") pod \"65101460-48b8-4bd6-82b0-4f5bd4254ec5\" (UID: \"65101460-48b8-4bd6-82b0-4f5bd4254ec5\") " Nov 28 13:22:45 crc kubenswrapper[4970]: I1128 13:22:45.148088 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/65101460-48b8-4bd6-82b0-4f5bd4254ec5-audit-dir\") pod \"65101460-48b8-4bd6-82b0-4f5bd4254ec5\" (UID: \"65101460-48b8-4bd6-82b0-4f5bd4254ec5\") " Nov 28 13:22:45 crc kubenswrapper[4970]: I1128 13:22:45.148126 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/65101460-48b8-4bd6-82b0-4f5bd4254ec5-v4-0-config-system-ocp-branding-template\") pod \"65101460-48b8-4bd6-82b0-4f5bd4254ec5\" (UID: \"65101460-48b8-4bd6-82b0-4f5bd4254ec5\") " Nov 28 13:22:45 crc kubenswrapper[4970]: I1128 13:22:45.148159 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/65101460-48b8-4bd6-82b0-4f5bd4254ec5-v4-0-config-system-router-certs\") pod \"65101460-48b8-4bd6-82b0-4f5bd4254ec5\" (UID: \"65101460-48b8-4bd6-82b0-4f5bd4254ec5\") " Nov 28 13:22:45 crc kubenswrapper[4970]: I1128 13:22:45.148162 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/65101460-48b8-4bd6-82b0-4f5bd4254ec5-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "65101460-48b8-4bd6-82b0-4f5bd4254ec5" (UID: "65101460-48b8-4bd6-82b0-4f5bd4254ec5"). InnerVolumeSpecName "audit-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 28 13:22:45 crc kubenswrapper[4970]: I1128 13:22:45.148185 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/65101460-48b8-4bd6-82b0-4f5bd4254ec5-audit-policies\") pod \"65101460-48b8-4bd6-82b0-4f5bd4254ec5\" (UID: \"65101460-48b8-4bd6-82b0-4f5bd4254ec5\") " Nov 28 13:22:45 crc kubenswrapper[4970]: I1128 13:22:45.148363 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/65101460-48b8-4bd6-82b0-4f5bd4254ec5-v4-0-config-user-template-error\") pod \"65101460-48b8-4bd6-82b0-4f5bd4254ec5\" (UID: \"65101460-48b8-4bd6-82b0-4f5bd4254ec5\") " Nov 28 13:22:45 crc kubenswrapper[4970]: I1128 13:22:45.148396 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/65101460-48b8-4bd6-82b0-4f5bd4254ec5-v4-0-config-user-template-login\") pod \"65101460-48b8-4bd6-82b0-4f5bd4254ec5\" (UID: \"65101460-48b8-4bd6-82b0-4f5bd4254ec5\") " Nov 28 13:22:45 crc kubenswrapper[4970]: I1128 13:22:45.148447 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/65101460-48b8-4bd6-82b0-4f5bd4254ec5-v4-0-config-system-session\") pod \"65101460-48b8-4bd6-82b0-4f5bd4254ec5\" (UID: \"65101460-48b8-4bd6-82b0-4f5bd4254ec5\") " Nov 28 13:22:45 crc kubenswrapper[4970]: I1128 13:22:45.148466 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/65101460-48b8-4bd6-82b0-4f5bd4254ec5-v4-0-config-system-serving-cert\") pod \"65101460-48b8-4bd6-82b0-4f5bd4254ec5\" (UID: \"65101460-48b8-4bd6-82b0-4f5bd4254ec5\") " Nov 28 13:22:45 crc kubenswrapper[4970]: I1128 13:22:45.148482 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/65101460-48b8-4bd6-82b0-4f5bd4254ec5-v4-0-config-user-template-provider-selection\") pod \"65101460-48b8-4bd6-82b0-4f5bd4254ec5\" (UID: \"65101460-48b8-4bd6-82b0-4f5bd4254ec5\") " Nov 28 13:22:45 crc kubenswrapper[4970]: I1128 13:22:45.148522 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/65101460-48b8-4bd6-82b0-4f5bd4254ec5-v4-0-config-system-service-ca\") pod \"65101460-48b8-4bd6-82b0-4f5bd4254ec5\" (UID: \"65101460-48b8-4bd6-82b0-4f5bd4254ec5\") " Nov 28 13:22:45 crc kubenswrapper[4970]: I1128 13:22:45.148576 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/65101460-48b8-4bd6-82b0-4f5bd4254ec5-v4-0-config-system-cliconfig\") pod \"65101460-48b8-4bd6-82b0-4f5bd4254ec5\" (UID: \"65101460-48b8-4bd6-82b0-4f5bd4254ec5\") " Nov 28 13:22:45 crc kubenswrapper[4970]: I1128 13:22:45.148622 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/65101460-48b8-4bd6-82b0-4f5bd4254ec5-v4-0-config-user-idp-0-file-data\") pod \"65101460-48b8-4bd6-82b0-4f5bd4254ec5\" (UID: \"65101460-48b8-4bd6-82b0-4f5bd4254ec5\") " Nov 28 13:22:45 crc kubenswrapper[4970]: I1128 
13:22:45.148654 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7tmnn\" (UniqueName: \"kubernetes.io/projected/65101460-48b8-4bd6-82b0-4f5bd4254ec5-kube-api-access-7tmnn\") pod \"65101460-48b8-4bd6-82b0-4f5bd4254ec5\" (UID: \"65101460-48b8-4bd6-82b0-4f5bd4254ec5\") " Nov 28 13:22:45 crc kubenswrapper[4970]: I1128 13:22:45.148856 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/65101460-48b8-4bd6-82b0-4f5bd4254ec5-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "65101460-48b8-4bd6-82b0-4f5bd4254ec5" (UID: "65101460-48b8-4bd6-82b0-4f5bd4254ec5"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 28 13:22:45 crc kubenswrapper[4970]: I1128 13:22:45.149079 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/65101460-48b8-4bd6-82b0-4f5bd4254ec5-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "65101460-48b8-4bd6-82b0-4f5bd4254ec5" (UID: "65101460-48b8-4bd6-82b0-4f5bd4254ec5"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 28 13:22:45 crc kubenswrapper[4970]: I1128 13:22:45.149229 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/65101460-48b8-4bd6-82b0-4f5bd4254ec5-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "65101460-48b8-4bd6-82b0-4f5bd4254ec5" (UID: "65101460-48b8-4bd6-82b0-4f5bd4254ec5"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 28 13:22:45 crc kubenswrapper[4970]: I1128 13:22:45.149415 4970 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/507a4c7b-eff8-4695-a222-3a40b0483eb8-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 28 13:22:45 crc kubenswrapper[4970]: I1128 13:22:45.149437 4970 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/65101460-48b8-4bd6-82b0-4f5bd4254ec5-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 28 13:22:45 crc kubenswrapper[4970]: I1128 13:22:45.149451 4970 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/65101460-48b8-4bd6-82b0-4f5bd4254ec5-audit-dir\") on node \"crc\" DevicePath \"\"" Nov 28 13:22:45 crc kubenswrapper[4970]: I1128 13:22:45.149461 4970 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/507a4c7b-eff8-4695-a222-3a40b0483eb8-utilities\") on node \"crc\" DevicePath \"\"" Nov 28 13:22:45 crc kubenswrapper[4970]: I1128 13:22:45.149471 4970 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-644lr\" (UniqueName: \"kubernetes.io/projected/507a4c7b-eff8-4695-a222-3a40b0483eb8-kube-api-access-644lr\") on node \"crc\" DevicePath \"\"" Nov 28 13:22:45 crc kubenswrapper[4970]: I1128 13:22:45.149480 4970 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/65101460-48b8-4bd6-82b0-4f5bd4254ec5-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Nov 28 13:22:45 crc kubenswrapper[4970]: I1128 13:22:45.149488 4970 
reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/65101460-48b8-4bd6-82b0-4f5bd4254ec5-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Nov 28 13:22:45 crc kubenswrapper[4970]: I1128 13:22:45.149806 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/65101460-48b8-4bd6-82b0-4f5bd4254ec5-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "65101460-48b8-4bd6-82b0-4f5bd4254ec5" (UID: "65101460-48b8-4bd6-82b0-4f5bd4254ec5"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 28 13:22:45 crc kubenswrapper[4970]: I1128 13:22:45.151732 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/65101460-48b8-4bd6-82b0-4f5bd4254ec5-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "65101460-48b8-4bd6-82b0-4f5bd4254ec5" (UID: "65101460-48b8-4bd6-82b0-4f5bd4254ec5"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 28 13:22:45 crc kubenswrapper[4970]: I1128 13:22:45.152184 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/65101460-48b8-4bd6-82b0-4f5bd4254ec5-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "65101460-48b8-4bd6-82b0-4f5bd4254ec5" (UID: "65101460-48b8-4bd6-82b0-4f5bd4254ec5"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 28 13:22:45 crc kubenswrapper[4970]: I1128 13:22:45.152252 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/65101460-48b8-4bd6-82b0-4f5bd4254ec5-kube-api-access-7tmnn" (OuterVolumeSpecName: "kube-api-access-7tmnn") pod "65101460-48b8-4bd6-82b0-4f5bd4254ec5" (UID: "65101460-48b8-4bd6-82b0-4f5bd4254ec5"). InnerVolumeSpecName "kube-api-access-7tmnn". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 28 13:22:45 crc kubenswrapper[4970]: I1128 13:22:45.152365 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/65101460-48b8-4bd6-82b0-4f5bd4254ec5-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "65101460-48b8-4bd6-82b0-4f5bd4254ec5" (UID: "65101460-48b8-4bd6-82b0-4f5bd4254ec5"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 28 13:22:45 crc kubenswrapper[4970]: I1128 13:22:45.152655 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/65101460-48b8-4bd6-82b0-4f5bd4254ec5-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "65101460-48b8-4bd6-82b0-4f5bd4254ec5" (UID: "65101460-48b8-4bd6-82b0-4f5bd4254ec5"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 28 13:22:45 crc kubenswrapper[4970]: I1128 13:22:45.152749 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/65101460-48b8-4bd6-82b0-4f5bd4254ec5-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "65101460-48b8-4bd6-82b0-4f5bd4254ec5" (UID: "65101460-48b8-4bd6-82b0-4f5bd4254ec5"). InnerVolumeSpecName "v4-0-config-user-template-error". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 28 13:22:45 crc kubenswrapper[4970]: I1128 13:22:45.152932 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/65101460-48b8-4bd6-82b0-4f5bd4254ec5-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "65101460-48b8-4bd6-82b0-4f5bd4254ec5" (UID: "65101460-48b8-4bd6-82b0-4f5bd4254ec5"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 28 13:22:45 crc kubenswrapper[4970]: I1128 13:22:45.153456 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/65101460-48b8-4bd6-82b0-4f5bd4254ec5-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "65101460-48b8-4bd6-82b0-4f5bd4254ec5" (UID: "65101460-48b8-4bd6-82b0-4f5bd4254ec5"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 28 13:22:45 crc kubenswrapper[4970]: I1128 13:22:45.161441 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/65101460-48b8-4bd6-82b0-4f5bd4254ec5-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "65101460-48b8-4bd6-82b0-4f5bd4254ec5" (UID: "65101460-48b8-4bd6-82b0-4f5bd4254ec5"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 28 13:22:45 crc kubenswrapper[4970]: I1128 13:22:45.251056 4970 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/65101460-48b8-4bd6-82b0-4f5bd4254ec5-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Nov 28 13:22:45 crc kubenswrapper[4970]: I1128 13:22:45.251103 4970 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/65101460-48b8-4bd6-82b0-4f5bd4254ec5-audit-policies\") on node \"crc\" DevicePath \"\"" Nov 28 13:22:45 crc kubenswrapper[4970]: I1128 13:22:45.251127 4970 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/65101460-48b8-4bd6-82b0-4f5bd4254ec5-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Nov 28 13:22:45 crc kubenswrapper[4970]: I1128 13:22:45.251144 4970 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/65101460-48b8-4bd6-82b0-4f5bd4254ec5-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Nov 28 13:22:45 crc kubenswrapper[4970]: I1128 13:22:45.251161 4970 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/65101460-48b8-4bd6-82b0-4f5bd4254ec5-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 28 13:22:45 crc kubenswrapper[4970]: I1128 13:22:45.251189 4970 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/65101460-48b8-4bd6-82b0-4f5bd4254ec5-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Nov 28 13:22:45 crc kubenswrapper[4970]: I1128 13:22:45.251210 4970 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: 
\"kubernetes.io/secret/65101460-48b8-4bd6-82b0-4f5bd4254ec5-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Nov 28 13:22:45 crc kubenswrapper[4970]: I1128 13:22:45.251249 4970 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/65101460-48b8-4bd6-82b0-4f5bd4254ec5-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Nov 28 13:22:45 crc kubenswrapper[4970]: I1128 13:22:45.251266 4970 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7tmnn\" (UniqueName: \"kubernetes.io/projected/65101460-48b8-4bd6-82b0-4f5bd4254ec5-kube-api-access-7tmnn\") on node \"crc\" DevicePath \"\"" Nov 28 13:22:45 crc kubenswrapper[4970]: I1128 13:22:45.251282 4970 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/65101460-48b8-4bd6-82b0-4f5bd4254ec5-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Nov 28 13:22:45 crc kubenswrapper[4970]: I1128 13:22:45.617110 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-cdzrp" event={"ID":"65101460-48b8-4bd6-82b0-4f5bd4254ec5","Type":"ContainerDied","Data":"f71cce2f30392cf6a3cb448ff268b7246b2baea435976519788ed6b8aac29e78"} Nov 28 13:22:45 crc kubenswrapper[4970]: I1128 13:22:45.617138 4970 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-cdzrp" Nov 28 13:22:45 crc kubenswrapper[4970]: I1128 13:22:45.617672 4970 scope.go:117] "RemoveContainer" containerID="9c6804f75305b0a57406da7538db40ab4eb23ecbedfba98d8420b87f8bd0bff0" Nov 28 13:22:45 crc kubenswrapper[4970]: I1128 13:22:45.623745 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-l4xkm" event={"ID":"507a4c7b-eff8-4695-a222-3a40b0483eb8","Type":"ContainerDied","Data":"f61c5c9e287118e35418471d83ff2cda4c01998b8748e8aa88947f45d43a5a40"} Nov 28 13:22:45 crc kubenswrapper[4970]: I1128 13:22:45.624431 4970 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-l4xkm" Nov 28 13:22:45 crc kubenswrapper[4970]: I1128 13:22:45.646640 4970 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-cdzrp"] Nov 28 13:22:45 crc kubenswrapper[4970]: I1128 13:22:45.650539 4970 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-cdzrp"] Nov 28 13:22:45 crc kubenswrapper[4970]: I1128 13:22:45.665140 4970 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-l4xkm"] Nov 28 13:22:45 crc kubenswrapper[4970]: I1128 13:22:45.668996 4970 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-l4xkm"] Nov 28 13:22:45 crc kubenswrapper[4970]: I1128 13:22:45.983707 4970 scope.go:117] "RemoveContainer" containerID="fe59cc14c4a43d3af87d4897d83c5eca2f2073278a22a1933e67f25c0c08b56f" Nov 28 13:22:46 crc kubenswrapper[4970]: I1128 13:22:46.250478 4970 scope.go:117] "RemoveContainer" containerID="768eecf438d47e590ee9a995a9acb52f4e1709f5bfc798dbc2349fd8ab5ba11c" Nov 28 13:22:46 crc kubenswrapper[4970]: I1128 13:22:46.864195 4970 scope.go:117] "RemoveContainer" containerID="22190e6d1ceed4e3ebb077d9a6db4582561199a9eeb757785c712980390d51da" Nov 28 13:22:47 crc kubenswrapper[4970]: I1128 13:22:47.286501 4970 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-9565f95f5-wnpt6"] Nov 28 13:22:47 crc kubenswrapper[4970]: E1128 13:22:47.287101 4970 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="507a4c7b-eff8-4695-a222-3a40b0483eb8" containerName="registry-server" Nov 28 13:22:47 crc kubenswrapper[4970]: I1128 13:22:47.287128 4970 state_mem.go:107] "Deleted CPUSet assignment" podUID="507a4c7b-eff8-4695-a222-3a40b0483eb8" containerName="registry-server" Nov 28 13:22:47 crc kubenswrapper[4970]: E1128 13:22:47.287147 4970 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="507a4c7b-eff8-4695-a222-3a40b0483eb8" containerName="extract-content" Nov 28 13:22:47 crc kubenswrapper[4970]: I1128 13:22:47.287161 4970 state_mem.go:107] "Deleted CPUSet assignment" podUID="507a4c7b-eff8-4695-a222-3a40b0483eb8" containerName="extract-content" Nov 28 13:22:47 crc kubenswrapper[4970]: E1128 13:22:47.287188 4970 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a5b9dda0-da70-4e7c-850b-de8b7744a15c" containerName="kube-multus-additional-cni-plugins" Nov 28 13:22:47 crc kubenswrapper[4970]: I1128 13:22:47.287200 4970 state_mem.go:107] "Deleted CPUSet assignment" podUID="a5b9dda0-da70-4e7c-850b-de8b7744a15c" containerName="kube-multus-additional-cni-plugins" Nov 28 13:22:47 crc kubenswrapper[4970]: E1128 13:22:47.287244 4970 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="65101460-48b8-4bd6-82b0-4f5bd4254ec5" containerName="oauth-openshift" Nov 28 13:22:47 crc kubenswrapper[4970]: I1128 13:22:47.287258 4970 state_mem.go:107] "Deleted CPUSet assignment" podUID="65101460-48b8-4bd6-82b0-4f5bd4254ec5" containerName="oauth-openshift" Nov 28 13:22:47 crc kubenswrapper[4970]: E1128 13:22:47.287280 4970 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="507a4c7b-eff8-4695-a222-3a40b0483eb8" containerName="extract-utilities" Nov 28 13:22:47 crc kubenswrapper[4970]: I1128 13:22:47.287293 4970 state_mem.go:107] "Deleted CPUSet assignment" podUID="507a4c7b-eff8-4695-a222-3a40b0483eb8" containerName="extract-utilities" Nov 28 13:22:47 crc 
kubenswrapper[4970]: E1128 13:22:47.287313 4970 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ff17b14e-1701-403e-ae66-0a20d0fc3792" containerName="pruner" Nov 28 13:22:47 crc kubenswrapper[4970]: I1128 13:22:47.287325 4970 state_mem.go:107] "Deleted CPUSet assignment" podUID="ff17b14e-1701-403e-ae66-0a20d0fc3792" containerName="pruner" Nov 28 13:22:47 crc kubenswrapper[4970]: I1128 13:22:47.287483 4970 memory_manager.go:354] "RemoveStaleState removing state" podUID="ff17b14e-1701-403e-ae66-0a20d0fc3792" containerName="pruner" Nov 28 13:22:47 crc kubenswrapper[4970]: I1128 13:22:47.287504 4970 memory_manager.go:354] "RemoveStaleState removing state" podUID="507a4c7b-eff8-4695-a222-3a40b0483eb8" containerName="registry-server" Nov 28 13:22:47 crc kubenswrapper[4970]: I1128 13:22:47.287528 4970 memory_manager.go:354] "RemoveStaleState removing state" podUID="a5b9dda0-da70-4e7c-850b-de8b7744a15c" containerName="kube-multus-additional-cni-plugins" Nov 28 13:22:47 crc kubenswrapper[4970]: I1128 13:22:47.287547 4970 memory_manager.go:354] "RemoveStaleState removing state" podUID="65101460-48b8-4bd6-82b0-4f5bd4254ec5" containerName="oauth-openshift" Nov 28 13:22:47 crc kubenswrapper[4970]: I1128 13:22:47.288118 4970 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-9565f95f5-wnpt6" Nov 28 13:22:47 crc kubenswrapper[4970]: I1128 13:22:47.293190 4970 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Nov 28 13:22:47 crc kubenswrapper[4970]: I1128 13:22:47.293290 4970 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Nov 28 13:22:47 crc kubenswrapper[4970]: I1128 13:22:47.293475 4970 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Nov 28 13:22:47 crc kubenswrapper[4970]: I1128 13:22:47.293492 4970 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Nov 28 13:22:47 crc kubenswrapper[4970]: I1128 13:22:47.296380 4970 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Nov 28 13:22:47 crc kubenswrapper[4970]: I1128 13:22:47.297199 4970 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Nov 28 13:22:47 crc kubenswrapper[4970]: I1128 13:22:47.297586 4970 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Nov 28 13:22:47 crc kubenswrapper[4970]: I1128 13:22:47.297978 4970 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Nov 28 13:22:47 crc kubenswrapper[4970]: I1128 13:22:47.298254 4970 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Nov 28 13:22:47 crc kubenswrapper[4970]: I1128 13:22:47.298406 4970 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Nov 28 13:22:47 crc kubenswrapper[4970]: I1128 13:22:47.302381 4970 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Nov 28 13:22:47 crc kubenswrapper[4970]: I1128 13:22:47.303676 4970 reflector.go:368] Caches 
populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Nov 28 13:22:47 crc kubenswrapper[4970]: I1128 13:22:47.309205 4970 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-9565f95f5-wnpt6"] Nov 28 13:22:47 crc kubenswrapper[4970]: I1128 13:22:47.311340 4970 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Nov 28 13:22:47 crc kubenswrapper[4970]: I1128 13:22:47.316355 4970 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Nov 28 13:22:47 crc kubenswrapper[4970]: I1128 13:22:47.324579 4970 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Nov 28 13:22:47 crc kubenswrapper[4970]: I1128 13:22:47.381067 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/bf8c2f64-bd1b-4883-a85a-a275da8968c3-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-9565f95f5-wnpt6\" (UID: \"bf8c2f64-bd1b-4883-a85a-a275da8968c3\") " pod="openshift-authentication/oauth-openshift-9565f95f5-wnpt6" Nov 28 13:22:47 crc kubenswrapper[4970]: I1128 13:22:47.381129 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/bf8c2f64-bd1b-4883-a85a-a275da8968c3-v4-0-config-user-template-error\") pod \"oauth-openshift-9565f95f5-wnpt6\" (UID: \"bf8c2f64-bd1b-4883-a85a-a275da8968c3\") " pod="openshift-authentication/oauth-openshift-9565f95f5-wnpt6" Nov 28 13:22:47 crc kubenswrapper[4970]: I1128 13:22:47.381154 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/bf8c2f64-bd1b-4883-a85a-a275da8968c3-audit-dir\") pod \"oauth-openshift-9565f95f5-wnpt6\" (UID: \"bf8c2f64-bd1b-4883-a85a-a275da8968c3\") " pod="openshift-authentication/oauth-openshift-9565f95f5-wnpt6" Nov 28 13:22:47 crc kubenswrapper[4970]: I1128 13:22:47.381176 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-88ds7\" (UniqueName: \"kubernetes.io/projected/bf8c2f64-bd1b-4883-a85a-a275da8968c3-kube-api-access-88ds7\") pod \"oauth-openshift-9565f95f5-wnpt6\" (UID: \"bf8c2f64-bd1b-4883-a85a-a275da8968c3\") " pod="openshift-authentication/oauth-openshift-9565f95f5-wnpt6" Nov 28 13:22:47 crc kubenswrapper[4970]: I1128 13:22:47.381206 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/bf8c2f64-bd1b-4883-a85a-a275da8968c3-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-9565f95f5-wnpt6\" (UID: \"bf8c2f64-bd1b-4883-a85a-a275da8968c3\") " pod="openshift-authentication/oauth-openshift-9565f95f5-wnpt6" Nov 28 13:22:47 crc kubenswrapper[4970]: I1128 13:22:47.381246 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/bf8c2f64-bd1b-4883-a85a-a275da8968c3-v4-0-config-system-cliconfig\") pod \"oauth-openshift-9565f95f5-wnpt6\" (UID: \"bf8c2f64-bd1b-4883-a85a-a275da8968c3\") " 
pod="openshift-authentication/oauth-openshift-9565f95f5-wnpt6" Nov 28 13:22:47 crc kubenswrapper[4970]: I1128 13:22:47.381373 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/bf8c2f64-bd1b-4883-a85a-a275da8968c3-v4-0-config-system-service-ca\") pod \"oauth-openshift-9565f95f5-wnpt6\" (UID: \"bf8c2f64-bd1b-4883-a85a-a275da8968c3\") " pod="openshift-authentication/oauth-openshift-9565f95f5-wnpt6" Nov 28 13:22:47 crc kubenswrapper[4970]: I1128 13:22:47.381450 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bf8c2f64-bd1b-4883-a85a-a275da8968c3-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-9565f95f5-wnpt6\" (UID: \"bf8c2f64-bd1b-4883-a85a-a275da8968c3\") " pod="openshift-authentication/oauth-openshift-9565f95f5-wnpt6" Nov 28 13:22:47 crc kubenswrapper[4970]: I1128 13:22:47.381505 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/bf8c2f64-bd1b-4883-a85a-a275da8968c3-v4-0-config-system-router-certs\") pod \"oauth-openshift-9565f95f5-wnpt6\" (UID: \"bf8c2f64-bd1b-4883-a85a-a275da8968c3\") " pod="openshift-authentication/oauth-openshift-9565f95f5-wnpt6" Nov 28 13:22:47 crc kubenswrapper[4970]: I1128 13:22:47.381529 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/bf8c2f64-bd1b-4883-a85a-a275da8968c3-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-9565f95f5-wnpt6\" (UID: \"bf8c2f64-bd1b-4883-a85a-a275da8968c3\") " pod="openshift-authentication/oauth-openshift-9565f95f5-wnpt6" Nov 28 13:22:47 crc kubenswrapper[4970]: I1128 13:22:47.381615 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/bf8c2f64-bd1b-4883-a85a-a275da8968c3-audit-policies\") pod \"oauth-openshift-9565f95f5-wnpt6\" (UID: \"bf8c2f64-bd1b-4883-a85a-a275da8968c3\") " pod="openshift-authentication/oauth-openshift-9565f95f5-wnpt6" Nov 28 13:22:47 crc kubenswrapper[4970]: I1128 13:22:47.381691 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/bf8c2f64-bd1b-4883-a85a-a275da8968c3-v4-0-config-system-session\") pod \"oauth-openshift-9565f95f5-wnpt6\" (UID: \"bf8c2f64-bd1b-4883-a85a-a275da8968c3\") " pod="openshift-authentication/oauth-openshift-9565f95f5-wnpt6" Nov 28 13:22:47 crc kubenswrapper[4970]: I1128 13:22:47.381740 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/bf8c2f64-bd1b-4883-a85a-a275da8968c3-v4-0-config-system-serving-cert\") pod \"oauth-openshift-9565f95f5-wnpt6\" (UID: \"bf8c2f64-bd1b-4883-a85a-a275da8968c3\") " pod="openshift-authentication/oauth-openshift-9565f95f5-wnpt6" Nov 28 13:22:47 crc kubenswrapper[4970]: I1128 13:22:47.381812 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: 
\"kubernetes.io/secret/bf8c2f64-bd1b-4883-a85a-a275da8968c3-v4-0-config-user-template-login\") pod \"oauth-openshift-9565f95f5-wnpt6\" (UID: \"bf8c2f64-bd1b-4883-a85a-a275da8968c3\") " pod="openshift-authentication/oauth-openshift-9565f95f5-wnpt6" Nov 28 13:22:47 crc kubenswrapper[4970]: I1128 13:22:47.394443 4970 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="507a4c7b-eff8-4695-a222-3a40b0483eb8" path="/var/lib/kubelet/pods/507a4c7b-eff8-4695-a222-3a40b0483eb8/volumes" Nov 28 13:22:47 crc kubenswrapper[4970]: I1128 13:22:47.395188 4970 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="65101460-48b8-4bd6-82b0-4f5bd4254ec5" path="/var/lib/kubelet/pods/65101460-48b8-4bd6-82b0-4f5bd4254ec5/volumes" Nov 28 13:22:47 crc kubenswrapper[4970]: I1128 13:22:47.483506 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/bf8c2f64-bd1b-4883-a85a-a275da8968c3-v4-0-config-system-router-certs\") pod \"oauth-openshift-9565f95f5-wnpt6\" (UID: \"bf8c2f64-bd1b-4883-a85a-a275da8968c3\") " pod="openshift-authentication/oauth-openshift-9565f95f5-wnpt6" Nov 28 13:22:47 crc kubenswrapper[4970]: I1128 13:22:47.483633 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/bf8c2f64-bd1b-4883-a85a-a275da8968c3-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-9565f95f5-wnpt6\" (UID: \"bf8c2f64-bd1b-4883-a85a-a275da8968c3\") " pod="openshift-authentication/oauth-openshift-9565f95f5-wnpt6" Nov 28 13:22:47 crc kubenswrapper[4970]: I1128 13:22:47.483683 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/bf8c2f64-bd1b-4883-a85a-a275da8968c3-audit-policies\") pod \"oauth-openshift-9565f95f5-wnpt6\" (UID: \"bf8c2f64-bd1b-4883-a85a-a275da8968c3\") " pod="openshift-authentication/oauth-openshift-9565f95f5-wnpt6" Nov 28 13:22:47 crc kubenswrapper[4970]: I1128 13:22:47.483727 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/bf8c2f64-bd1b-4883-a85a-a275da8968c3-v4-0-config-system-session\") pod \"oauth-openshift-9565f95f5-wnpt6\" (UID: \"bf8c2f64-bd1b-4883-a85a-a275da8968c3\") " pod="openshift-authentication/oauth-openshift-9565f95f5-wnpt6" Nov 28 13:22:47 crc kubenswrapper[4970]: I1128 13:22:47.483759 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/bf8c2f64-bd1b-4883-a85a-a275da8968c3-v4-0-config-system-serving-cert\") pod \"oauth-openshift-9565f95f5-wnpt6\" (UID: \"bf8c2f64-bd1b-4883-a85a-a275da8968c3\") " pod="openshift-authentication/oauth-openshift-9565f95f5-wnpt6" Nov 28 13:22:47 crc kubenswrapper[4970]: I1128 13:22:47.483807 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/bf8c2f64-bd1b-4883-a85a-a275da8968c3-v4-0-config-user-template-login\") pod \"oauth-openshift-9565f95f5-wnpt6\" (UID: \"bf8c2f64-bd1b-4883-a85a-a275da8968c3\") " pod="openshift-authentication/oauth-openshift-9565f95f5-wnpt6" Nov 28 13:22:47 crc kubenswrapper[4970]: I1128 13:22:47.483853 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/bf8c2f64-bd1b-4883-a85a-a275da8968c3-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-9565f95f5-wnpt6\" (UID: \"bf8c2f64-bd1b-4883-a85a-a275da8968c3\") " pod="openshift-authentication/oauth-openshift-9565f95f5-wnpt6" Nov 28 13:22:47 crc kubenswrapper[4970]: I1128 13:22:47.483918 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/bf8c2f64-bd1b-4883-a85a-a275da8968c3-v4-0-config-user-template-error\") pod \"oauth-openshift-9565f95f5-wnpt6\" (UID: \"bf8c2f64-bd1b-4883-a85a-a275da8968c3\") " pod="openshift-authentication/oauth-openshift-9565f95f5-wnpt6" Nov 28 13:22:47 crc kubenswrapper[4970]: I1128 13:22:47.483955 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/bf8c2f64-bd1b-4883-a85a-a275da8968c3-audit-dir\") pod \"oauth-openshift-9565f95f5-wnpt6\" (UID: \"bf8c2f64-bd1b-4883-a85a-a275da8968c3\") " pod="openshift-authentication/oauth-openshift-9565f95f5-wnpt6" Nov 28 13:22:47 crc kubenswrapper[4970]: I1128 13:22:47.483996 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-88ds7\" (UniqueName: \"kubernetes.io/projected/bf8c2f64-bd1b-4883-a85a-a275da8968c3-kube-api-access-88ds7\") pod \"oauth-openshift-9565f95f5-wnpt6\" (UID: \"bf8c2f64-bd1b-4883-a85a-a275da8968c3\") " pod="openshift-authentication/oauth-openshift-9565f95f5-wnpt6" Nov 28 13:22:47 crc kubenswrapper[4970]: I1128 13:22:47.484044 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/bf8c2f64-bd1b-4883-a85a-a275da8968c3-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-9565f95f5-wnpt6\" (UID: \"bf8c2f64-bd1b-4883-a85a-a275da8968c3\") " pod="openshift-authentication/oauth-openshift-9565f95f5-wnpt6" Nov 28 13:22:47 crc kubenswrapper[4970]: I1128 13:22:47.484085 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/bf8c2f64-bd1b-4883-a85a-a275da8968c3-v4-0-config-system-cliconfig\") pod \"oauth-openshift-9565f95f5-wnpt6\" (UID: \"bf8c2f64-bd1b-4883-a85a-a275da8968c3\") " pod="openshift-authentication/oauth-openshift-9565f95f5-wnpt6" Nov 28 13:22:47 crc kubenswrapper[4970]: I1128 13:22:47.484118 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/bf8c2f64-bd1b-4883-a85a-a275da8968c3-v4-0-config-system-service-ca\") pod \"oauth-openshift-9565f95f5-wnpt6\" (UID: \"bf8c2f64-bd1b-4883-a85a-a275da8968c3\") " pod="openshift-authentication/oauth-openshift-9565f95f5-wnpt6" Nov 28 13:22:47 crc kubenswrapper[4970]: I1128 13:22:47.484151 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bf8c2f64-bd1b-4883-a85a-a275da8968c3-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-9565f95f5-wnpt6\" (UID: \"bf8c2f64-bd1b-4883-a85a-a275da8968c3\") " pod="openshift-authentication/oauth-openshift-9565f95f5-wnpt6" Nov 28 13:22:47 crc kubenswrapper[4970]: I1128 13:22:47.484444 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: 
\"kubernetes.io/host-path/bf8c2f64-bd1b-4883-a85a-a275da8968c3-audit-dir\") pod \"oauth-openshift-9565f95f5-wnpt6\" (UID: \"bf8c2f64-bd1b-4883-a85a-a275da8968c3\") " pod="openshift-authentication/oauth-openshift-9565f95f5-wnpt6" Nov 28 13:22:47 crc kubenswrapper[4970]: I1128 13:22:47.485796 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bf8c2f64-bd1b-4883-a85a-a275da8968c3-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-9565f95f5-wnpt6\" (UID: \"bf8c2f64-bd1b-4883-a85a-a275da8968c3\") " pod="openshift-authentication/oauth-openshift-9565f95f5-wnpt6" Nov 28 13:22:47 crc kubenswrapper[4970]: I1128 13:22:47.485881 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/bf8c2f64-bd1b-4883-a85a-a275da8968c3-audit-policies\") pod \"oauth-openshift-9565f95f5-wnpt6\" (UID: \"bf8c2f64-bd1b-4883-a85a-a275da8968c3\") " pod="openshift-authentication/oauth-openshift-9565f95f5-wnpt6" Nov 28 13:22:47 crc kubenswrapper[4970]: I1128 13:22:47.486801 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/bf8c2f64-bd1b-4883-a85a-a275da8968c3-v4-0-config-system-cliconfig\") pod \"oauth-openshift-9565f95f5-wnpt6\" (UID: \"bf8c2f64-bd1b-4883-a85a-a275da8968c3\") " pod="openshift-authentication/oauth-openshift-9565f95f5-wnpt6" Nov 28 13:22:47 crc kubenswrapper[4970]: I1128 13:22:47.487460 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/bf8c2f64-bd1b-4883-a85a-a275da8968c3-v4-0-config-system-service-ca\") pod \"oauth-openshift-9565f95f5-wnpt6\" (UID: \"bf8c2f64-bd1b-4883-a85a-a275da8968c3\") " pod="openshift-authentication/oauth-openshift-9565f95f5-wnpt6" Nov 28 13:22:47 crc kubenswrapper[4970]: I1128 13:22:47.490602 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/bf8c2f64-bd1b-4883-a85a-a275da8968c3-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-9565f95f5-wnpt6\" (UID: \"bf8c2f64-bd1b-4883-a85a-a275da8968c3\") " pod="openshift-authentication/oauth-openshift-9565f95f5-wnpt6" Nov 28 13:22:47 crc kubenswrapper[4970]: I1128 13:22:47.490735 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/bf8c2f64-bd1b-4883-a85a-a275da8968c3-v4-0-config-system-session\") pod \"oauth-openshift-9565f95f5-wnpt6\" (UID: \"bf8c2f64-bd1b-4883-a85a-a275da8968c3\") " pod="openshift-authentication/oauth-openshift-9565f95f5-wnpt6" Nov 28 13:22:47 crc kubenswrapper[4970]: I1128 13:22:47.491824 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/bf8c2f64-bd1b-4883-a85a-a275da8968c3-v4-0-config-user-template-error\") pod \"oauth-openshift-9565f95f5-wnpt6\" (UID: \"bf8c2f64-bd1b-4883-a85a-a275da8968c3\") " pod="openshift-authentication/oauth-openshift-9565f95f5-wnpt6" Nov 28 13:22:47 crc kubenswrapper[4970]: I1128 13:22:47.492040 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/bf8c2f64-bd1b-4883-a85a-a275da8968c3-v4-0-config-user-template-provider-selection\") 
pod \"oauth-openshift-9565f95f5-wnpt6\" (UID: \"bf8c2f64-bd1b-4883-a85a-a275da8968c3\") " pod="openshift-authentication/oauth-openshift-9565f95f5-wnpt6" Nov 28 13:22:47 crc kubenswrapper[4970]: I1128 13:22:47.492321 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/bf8c2f64-bd1b-4883-a85a-a275da8968c3-v4-0-config-system-serving-cert\") pod \"oauth-openshift-9565f95f5-wnpt6\" (UID: \"bf8c2f64-bd1b-4883-a85a-a275da8968c3\") " pod="openshift-authentication/oauth-openshift-9565f95f5-wnpt6" Nov 28 13:22:47 crc kubenswrapper[4970]: I1128 13:22:47.492324 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/bf8c2f64-bd1b-4883-a85a-a275da8968c3-v4-0-config-system-router-certs\") pod \"oauth-openshift-9565f95f5-wnpt6\" (UID: \"bf8c2f64-bd1b-4883-a85a-a275da8968c3\") " pod="openshift-authentication/oauth-openshift-9565f95f5-wnpt6" Nov 28 13:22:47 crc kubenswrapper[4970]: I1128 13:22:47.492429 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/bf8c2f64-bd1b-4883-a85a-a275da8968c3-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-9565f95f5-wnpt6\" (UID: \"bf8c2f64-bd1b-4883-a85a-a275da8968c3\") " pod="openshift-authentication/oauth-openshift-9565f95f5-wnpt6" Nov 28 13:22:47 crc kubenswrapper[4970]: I1128 13:22:47.492566 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/bf8c2f64-bd1b-4883-a85a-a275da8968c3-v4-0-config-user-template-login\") pod \"oauth-openshift-9565f95f5-wnpt6\" (UID: \"bf8c2f64-bd1b-4883-a85a-a275da8968c3\") " pod="openshift-authentication/oauth-openshift-9565f95f5-wnpt6" Nov 28 13:22:47 crc kubenswrapper[4970]: I1128 13:22:47.503049 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-88ds7\" (UniqueName: \"kubernetes.io/projected/bf8c2f64-bd1b-4883-a85a-a275da8968c3-kube-api-access-88ds7\") pod \"oauth-openshift-9565f95f5-wnpt6\" (UID: \"bf8c2f64-bd1b-4883-a85a-a275da8968c3\") " pod="openshift-authentication/oauth-openshift-9565f95f5-wnpt6" Nov 28 13:22:47 crc kubenswrapper[4970]: I1128 13:22:47.601744 4970 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-9565f95f5-wnpt6" Nov 28 13:22:47 crc kubenswrapper[4970]: I1128 13:22:47.648509 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fp5dk" event={"ID":"7d9651e6-3bf3-4aad-b2ef-8fb6c42eea44","Type":"ContainerStarted","Data":"49063d2a8e369100f42c57c9f14ee6d892d09e8e1f031f7b9fccc09f1ac7be64"} Nov 28 13:22:47 crc kubenswrapper[4970]: I1128 13:22:47.652644 4970 generic.go:334] "Generic (PLEG): container finished" podID="ea4c7183-d326-44cd-8e40-649e3dad901e" containerID="099d18e9871153a8d25104339be29911068bc6a9315d9bf99e798b8b6077b826" exitCode=0 Nov 28 13:22:47 crc kubenswrapper[4970]: I1128 13:22:47.652688 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5jvzw" event={"ID":"ea4c7183-d326-44cd-8e40-649e3dad901e","Type":"ContainerDied","Data":"099d18e9871153a8d25104339be29911068bc6a9315d9bf99e798b8b6077b826"} Nov 28 13:22:47 crc kubenswrapper[4970]: I1128 13:22:47.655675 4970 generic.go:334] "Generic (PLEG): container finished" podID="619af67d-331c-4b38-b536-269ba823fd75" containerID="a7fc78551a9342d51dcf43583630959f6729932e312d0b2a660beef6e2520a3d" exitCode=0 Nov 28 13:22:47 crc kubenswrapper[4970]: I1128 13:22:47.655722 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jcv9n" event={"ID":"619af67d-331c-4b38-b536-269ba823fd75","Type":"ContainerDied","Data":"a7fc78551a9342d51dcf43583630959f6729932e312d0b2a660beef6e2520a3d"} Nov 28 13:22:47 crc kubenswrapper[4970]: I1128 13:22:47.664131 4970 generic.go:334] "Generic (PLEG): container finished" podID="4cae710a-4284-4d76-b507-d7aa55adba72" containerID="199e4a0230ff86d77ea52d3e3936209bc759b65e09d92423fffbcc087ab1f9e4" exitCode=0 Nov 28 13:22:47 crc kubenswrapper[4970]: I1128 13:22:47.664203 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-nzvwp" event={"ID":"4cae710a-4284-4d76-b507-d7aa55adba72","Type":"ContainerDied","Data":"199e4a0230ff86d77ea52d3e3936209bc759b65e09d92423fffbcc087ab1f9e4"} Nov 28 13:22:47 crc kubenswrapper[4970]: I1128 13:22:47.677399 4970 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-fp5dk" podStartSLOduration=3.003721638 podStartE2EDuration="2m9.677381023s" podCreationTimestamp="2025-11-28 13:20:38 +0000 UTC" firstStartedPulling="2025-11-28 13:20:40.191656167 +0000 UTC m=+51.044537967" lastFinishedPulling="2025-11-28 13:22:46.865315552 +0000 UTC m=+177.718197352" observedRunningTime="2025-11-28 13:22:47.670513999 +0000 UTC m=+178.523395799" watchObservedRunningTime="2025-11-28 13:22:47.677381023 +0000 UTC m=+178.530262823" Nov 28 13:22:48 crc kubenswrapper[4970]: I1128 13:22:48.045870 4970 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-9565f95f5-wnpt6"] Nov 28 13:22:48 crc kubenswrapper[4970]: I1128 13:22:48.672110 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5jvzw" event={"ID":"ea4c7183-d326-44cd-8e40-649e3dad901e","Type":"ContainerStarted","Data":"8f48410108384348a23dcc8ad95b89dd4a08166513e5ce8137535e346aee3a6a"} Nov 28 13:22:48 crc kubenswrapper[4970]: I1128 13:22:48.673979 4970 generic.go:334] "Generic (PLEG): container finished" podID="a92f4929-dcab-4362-a5f2-c648f274bf04" containerID="086d1c6c32439850e2b06584180f0ca96a77b64a3d5bf6b3abf029c47e1d7b69" exitCode=0 Nov 28 13:22:48 crc 
kubenswrapper[4970]: I1128 13:22:48.674098 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wwr74" event={"ID":"a92f4929-dcab-4362-a5f2-c648f274bf04","Type":"ContainerDied","Data":"086d1c6c32439850e2b06584180f0ca96a77b64a3d5bf6b3abf029c47e1d7b69"} Nov 28 13:22:48 crc kubenswrapper[4970]: I1128 13:22:48.675613 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-9565f95f5-wnpt6" event={"ID":"bf8c2f64-bd1b-4883-a85a-a275da8968c3","Type":"ContainerStarted","Data":"a0781af1489efa18b249fb64c9fec48ed85ba21243082536de0288f1c950f484"} Nov 28 13:22:48 crc kubenswrapper[4970]: I1128 13:22:48.675662 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-9565f95f5-wnpt6" event={"ID":"bf8c2f64-bd1b-4883-a85a-a275da8968c3","Type":"ContainerStarted","Data":"926f0ffb211a6b719111ee83c7297baf0ea35692c3144e4a0d0b442542139d16"} Nov 28 13:22:48 crc kubenswrapper[4970]: I1128 13:22:48.675685 4970 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-9565f95f5-wnpt6" Nov 28 13:22:48 crc kubenswrapper[4970]: I1128 13:22:48.678253 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jcv9n" event={"ID":"619af67d-331c-4b38-b536-269ba823fd75","Type":"ContainerStarted","Data":"372b111f33d89b882be0cd06962157092d2b2f730628daa2a1f56f329c7476aa"} Nov 28 13:22:48 crc kubenswrapper[4970]: I1128 13:22:48.680244 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-nzvwp" event={"ID":"4cae710a-4284-4d76-b507-d7aa55adba72","Type":"ContainerStarted","Data":"4460ae67a212301ea161ef043a360dfa83a368ae92cb4fe6950f58a61b0899cc"} Nov 28 13:22:48 crc kubenswrapper[4970]: I1128 13:22:48.700401 4970 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-5jvzw" podStartSLOduration=2.711718152 podStartE2EDuration="2m13.700376554s" podCreationTimestamp="2025-11-28 13:20:35 +0000 UTC" firstStartedPulling="2025-11-28 13:20:37.151872341 +0000 UTC m=+48.004754141" lastFinishedPulling="2025-11-28 13:22:48.140530743 +0000 UTC m=+178.993412543" observedRunningTime="2025-11-28 13:22:48.693422188 +0000 UTC m=+179.546303988" watchObservedRunningTime="2025-11-28 13:22:48.700376554 +0000 UTC m=+179.553258374" Nov 28 13:22:48 crc kubenswrapper[4970]: I1128 13:22:48.733530 4970 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-nzvwp" podStartSLOduration=2.7214388229999997 podStartE2EDuration="2m11.733509097s" podCreationTimestamp="2025-11-28 13:20:37 +0000 UTC" firstStartedPulling="2025-11-28 13:20:39.181353667 +0000 UTC m=+50.034235467" lastFinishedPulling="2025-11-28 13:22:48.193423931 +0000 UTC m=+179.046305741" observedRunningTime="2025-11-28 13:22:48.73201857 +0000 UTC m=+179.584900370" watchObservedRunningTime="2025-11-28 13:22:48.733509097 +0000 UTC m=+179.586390907" Nov 28 13:22:48 crc kubenswrapper[4970]: I1128 13:22:48.751839 4970 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-jcv9n" podStartSLOduration=3.7331150429999997 podStartE2EDuration="2m14.751821197s" podCreationTimestamp="2025-11-28 13:20:34 +0000 UTC" firstStartedPulling="2025-11-28 13:20:37.14368076 +0000 UTC m=+47.996562560" lastFinishedPulling="2025-11-28 13:22:48.162386914 +0000 UTC m=+179.015268714" 
observedRunningTime="2025-11-28 13:22:48.749940889 +0000 UTC m=+179.602822709" watchObservedRunningTime="2025-11-28 13:22:48.751821197 +0000 UTC m=+179.604702997" Nov 28 13:22:48 crc kubenswrapper[4970]: I1128 13:22:48.773368 4970 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-9565f95f5-wnpt6" podStartSLOduration=31.773354718 podStartE2EDuration="31.773354718s" podCreationTimestamp="2025-11-28 13:22:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-28 13:22:48.77052632 +0000 UTC m=+179.623408120" watchObservedRunningTime="2025-11-28 13:22:48.773354718 +0000 UTC m=+179.626236508" Nov 28 13:22:49 crc kubenswrapper[4970]: I1128 13:22:49.165197 4970 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-9565f95f5-wnpt6" Nov 28 13:22:49 crc kubenswrapper[4970]: I1128 13:22:49.236167 4970 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-fp5dk" Nov 28 13:22:49 crc kubenswrapper[4970]: I1128 13:22:49.237405 4970 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-fp5dk" Nov 28 13:22:49 crc kubenswrapper[4970]: I1128 13:22:49.687733 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wwr74" event={"ID":"a92f4929-dcab-4362-a5f2-c648f274bf04","Type":"ContainerStarted","Data":"e42596a6fd10a064e3a8d5ef2207ddfd0fe766ec8ad644cb119a9d63d9354d75"} Nov 28 13:22:50 crc kubenswrapper[4970]: I1128 13:22:50.129551 4970 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Nov 28 13:22:50 crc kubenswrapper[4970]: I1128 13:22:50.130393 4970 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Nov 28 13:22:50 crc kubenswrapper[4970]: I1128 13:22:50.130678 4970 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Nov 28 13:22:50 crc kubenswrapper[4970]: I1128 13:22:50.131418 4970 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" containerID="cri-o://db18f7f1730110037ceac0335bfd4c4176cd8e6b4f3aeb9f7caacaeb82c0a9e9" gracePeriod=15 Nov 28 13:22:50 crc kubenswrapper[4970]: I1128 13:22:50.131491 4970 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" containerID="cri-o://478a53f5d28b31054e796d95569a25b33787f3d2562ca562603b609f530e95d0" gracePeriod=15 Nov 28 13:22:50 crc kubenswrapper[4970]: I1128 13:22:50.131608 4970 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" containerID="cri-o://1886c741d951051096d9345e87c6c1386f48b9576ae2d6ed095a79be42c5297c" gracePeriod=15 Nov 28 13:22:50 crc kubenswrapper[4970]: I1128 13:22:50.131836 4970 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" containerID="cri-o://7c49f9caf580dac97faa02766cf5d85c9483f0b201e1987c285ad9aa45178374" gracePeriod=15 Nov 28 13:22:50 crc kubenswrapper[4970]: I1128 13:22:50.131587 4970 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" containerID="cri-o://6d5b3592754b29f4aebbcfcdd7d4f38b10e70438680d2f9d86409bd2af4c1b84" gracePeriod=15 Nov 28 13:22:50 crc kubenswrapper[4970]: I1128 13:22:50.132466 4970 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Nov 28 13:22:50 crc kubenswrapper[4970]: E1128 13:22:50.132764 4970 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Nov 28 13:22:50 crc kubenswrapper[4970]: I1128 13:22:50.132784 4970 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Nov 28 13:22:50 crc kubenswrapper[4970]: E1128 13:22:50.132807 4970 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Nov 28 13:22:50 crc kubenswrapper[4970]: I1128 13:22:50.132820 4970 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Nov 28 13:22:50 crc kubenswrapper[4970]: E1128 13:22:50.132841 4970 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Nov 28 13:22:50 crc kubenswrapper[4970]: I1128 13:22:50.132854 4970 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Nov 28 
13:22:50 crc kubenswrapper[4970]: E1128 13:22:50.132870 4970 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Nov 28 13:22:50 crc kubenswrapper[4970]: I1128 13:22:50.132881 4970 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Nov 28 13:22:50 crc kubenswrapper[4970]: E1128 13:22:50.132900 4970 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Nov 28 13:22:50 crc kubenswrapper[4970]: I1128 13:22:50.132912 4970 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Nov 28 13:22:50 crc kubenswrapper[4970]: E1128 13:22:50.132930 4970 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Nov 28 13:22:50 crc kubenswrapper[4970]: I1128 13:22:50.132942 4970 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Nov 28 13:22:50 crc kubenswrapper[4970]: E1128 13:22:50.132957 4970 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Nov 28 13:22:50 crc kubenswrapper[4970]: I1128 13:22:50.132968 4970 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Nov 28 13:22:50 crc kubenswrapper[4970]: I1128 13:22:50.133133 4970 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Nov 28 13:22:50 crc kubenswrapper[4970]: I1128 13:22:50.133148 4970 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Nov 28 13:22:50 crc kubenswrapper[4970]: I1128 13:22:50.133163 4970 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Nov 28 13:22:50 crc kubenswrapper[4970]: I1128 13:22:50.133176 4970 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Nov 28 13:22:50 crc kubenswrapper[4970]: I1128 13:22:50.133193 4970 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Nov 28 13:22:50 crc kubenswrapper[4970]: I1128 13:22:50.133211 4970 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Nov 28 13:22:50 crc kubenswrapper[4970]: I1128 13:22:50.218496 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 28 13:22:50 crc kubenswrapper[4970]: I1128 13:22:50.218546 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: 
\"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Nov 28 13:22:50 crc kubenswrapper[4970]: I1128 13:22:50.218585 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Nov 28 13:22:50 crc kubenswrapper[4970]: I1128 13:22:50.218614 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 28 13:22:50 crc kubenswrapper[4970]: I1128 13:22:50.218632 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Nov 28 13:22:50 crc kubenswrapper[4970]: I1128 13:22:50.218659 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 28 13:22:50 crc kubenswrapper[4970]: I1128 13:22:50.218718 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Nov 28 13:22:50 crc kubenswrapper[4970]: I1128 13:22:50.218740 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Nov 28 13:22:50 crc kubenswrapper[4970]: I1128 13:22:50.283703 4970 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-fp5dk" podUID="7d9651e6-3bf3-4aad-b2ef-8fb6c42eea44" containerName="registry-server" probeResult="failure" output=< Nov 28 13:22:50 crc kubenswrapper[4970]: timeout: failed to connect service ":50051" within 1s Nov 28 13:22:50 crc kubenswrapper[4970]: > Nov 28 13:22:50 crc kubenswrapper[4970]: E1128 13:22:50.284504 4970 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/events\": dial tcp 38.102.83.212:6443: connect: connection refused" event=< Nov 28 13:22:50 crc kubenswrapper[4970]: &Event{ObjectMeta:{redhat-operators-fp5dk.187c2e662622f60d openshift-marketplace 0 0001-01-01 00:00:00 +0000 UTC map[] 
map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-marketplace,Name:redhat-operators-fp5dk,UID:7d9651e6-3bf3-4aad-b2ef-8fb6c42eea44,APIVersion:v1,ResourceVersion:28498,FieldPath:spec.containers{registry-server},},Reason:Unhealthy,Message:Startup probe failed: timeout: failed to connect service ":50051" within 1s Nov 28 13:22:50 crc kubenswrapper[4970]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2025-11-28 13:22:50.283759117 +0000 UTC m=+181.136640917,LastTimestamp:2025-11-28 13:22:50.283759117 +0000 UTC m=+181.136640917,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Nov 28 13:22:50 crc kubenswrapper[4970]: > Nov 28 13:22:50 crc kubenswrapper[4970]: I1128 13:22:50.320292 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 28 13:22:50 crc kubenswrapper[4970]: I1128 13:22:50.320331 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Nov 28 13:22:50 crc kubenswrapper[4970]: I1128 13:22:50.320357 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Nov 28 13:22:50 crc kubenswrapper[4970]: I1128 13:22:50.320378 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 28 13:22:50 crc kubenswrapper[4970]: I1128 13:22:50.320395 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Nov 28 13:22:50 crc kubenswrapper[4970]: I1128 13:22:50.320415 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 28 13:22:50 crc kubenswrapper[4970]: I1128 13:22:50.320451 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Nov 28 13:22:50 crc kubenswrapper[4970]: I1128 13:22:50.320444 4970 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 28 13:22:50 crc kubenswrapper[4970]: I1128 13:22:50.320468 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Nov 28 13:22:50 crc kubenswrapper[4970]: I1128 13:22:50.320482 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 28 13:22:50 crc kubenswrapper[4970]: I1128 13:22:50.320491 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 28 13:22:50 crc kubenswrapper[4970]: I1128 13:22:50.320464 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Nov 28 13:22:50 crc kubenswrapper[4970]: I1128 13:22:50.320501 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Nov 28 13:22:50 crc kubenswrapper[4970]: I1128 13:22:50.320529 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Nov 28 13:22:50 crc kubenswrapper[4970]: I1128 13:22:50.320533 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Nov 28 13:22:50 crc kubenswrapper[4970]: I1128 13:22:50.320631 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Nov 28 13:22:50 crc kubenswrapper[4970]: I1128 13:22:50.693492 4970 generic.go:334] "Generic (PLEG): container finished" podID="0a609cfd-28dc-440a-8433-6933565864a7" 
containerID="0ba9afdf36f07a12061629b547653e9030f8dc938cac0fc11c59d04fc0cc587b" exitCode=0 Nov 28 13:22:50 crc kubenswrapper[4970]: I1128 13:22:50.693582 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"0a609cfd-28dc-440a-8433-6933565864a7","Type":"ContainerDied","Data":"0ba9afdf36f07a12061629b547653e9030f8dc938cac0fc11c59d04fc0cc587b"} Nov 28 13:22:50 crc kubenswrapper[4970]: I1128 13:22:50.694309 4970 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.212:6443: connect: connection refused" Nov 28 13:22:50 crc kubenswrapper[4970]: I1128 13:22:50.694627 4970 status_manager.go:851] "Failed to get status for pod" podUID="0a609cfd-28dc-440a-8433-6933565864a7" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.212:6443: connect: connection refused" Nov 28 13:22:50 crc kubenswrapper[4970]: I1128 13:22:50.695391 4970 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Nov 28 13:22:50 crc kubenswrapper[4970]: I1128 13:22:50.697845 4970 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Nov 28 13:22:50 crc kubenswrapper[4970]: I1128 13:22:50.698549 4970 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="478a53f5d28b31054e796d95569a25b33787f3d2562ca562603b609f530e95d0" exitCode=0 Nov 28 13:22:50 crc kubenswrapper[4970]: I1128 13:22:50.698576 4970 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="6d5b3592754b29f4aebbcfcdd7d4f38b10e70438680d2f9d86409bd2af4c1b84" exitCode=0 Nov 28 13:22:50 crc kubenswrapper[4970]: I1128 13:22:50.698592 4970 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="1886c741d951051096d9345e87c6c1386f48b9576ae2d6ed095a79be42c5297c" exitCode=0 Nov 28 13:22:50 crc kubenswrapper[4970]: I1128 13:22:50.698607 4970 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="7c49f9caf580dac97faa02766cf5d85c9483f0b201e1987c285ad9aa45178374" exitCode=2 Nov 28 13:22:50 crc kubenswrapper[4970]: I1128 13:22:50.698672 4970 scope.go:117] "RemoveContainer" containerID="89211b552b7e2893a562bd785f35642f3b6792b0a123133d459a728b9ca6f5f3" Nov 28 13:22:51 crc kubenswrapper[4970]: I1128 13:22:51.183397 4970 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:6443/readyz\": dial tcp 192.168.126.11:6443: connect: connection refused" start-of-body= Nov 28 13:22:51 crc kubenswrapper[4970]: I1128 13:22:51.183818 4970 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="Get \"https://192.168.126.11:6443/readyz\": dial tcp 192.168.126.11:6443: connect: connection 
refused" Nov 28 13:22:51 crc kubenswrapper[4970]: I1128 13:22:51.334117 4970 patch_prober.go:28] interesting pod/machine-config-daemon-tjrng container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 28 13:22:51 crc kubenswrapper[4970]: I1128 13:22:51.334251 4970 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-tjrng" podUID="70bedd43-c527-436e-b47b-0b9ec5b10601" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 28 13:22:51 crc kubenswrapper[4970]: I1128 13:22:51.710294 4970 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Nov 28 13:22:52 crc kubenswrapper[4970]: I1128 13:22:52.006086 4970 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Nov 28 13:22:52 crc kubenswrapper[4970]: I1128 13:22:52.007884 4970 status_manager.go:851] "Failed to get status for pod" podUID="0a609cfd-28dc-440a-8433-6933565864a7" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.212:6443: connect: connection refused" Nov 28 13:22:52 crc kubenswrapper[4970]: I1128 13:22:52.146098 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0a609cfd-28dc-440a-8433-6933565864a7-kube-api-access\") pod \"0a609cfd-28dc-440a-8433-6933565864a7\" (UID: \"0a609cfd-28dc-440a-8433-6933565864a7\") " Nov 28 13:22:52 crc kubenswrapper[4970]: I1128 13:22:52.146201 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/0a609cfd-28dc-440a-8433-6933565864a7-var-lock\") pod \"0a609cfd-28dc-440a-8433-6933565864a7\" (UID: \"0a609cfd-28dc-440a-8433-6933565864a7\") " Nov 28 13:22:52 crc kubenswrapper[4970]: I1128 13:22:52.146275 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/0a609cfd-28dc-440a-8433-6933565864a7-kubelet-dir\") pod \"0a609cfd-28dc-440a-8433-6933565864a7\" (UID: \"0a609cfd-28dc-440a-8433-6933565864a7\") " Nov 28 13:22:52 crc kubenswrapper[4970]: I1128 13:22:52.146457 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/0a609cfd-28dc-440a-8433-6933565864a7-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "0a609cfd-28dc-440a-8433-6933565864a7" (UID: "0a609cfd-28dc-440a-8433-6933565864a7"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 28 13:22:52 crc kubenswrapper[4970]: I1128 13:22:52.146546 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/0a609cfd-28dc-440a-8433-6933565864a7-var-lock" (OuterVolumeSpecName: "var-lock") pod "0a609cfd-28dc-440a-8433-6933565864a7" (UID: "0a609cfd-28dc-440a-8433-6933565864a7"). InnerVolumeSpecName "var-lock". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 28 13:22:52 crc kubenswrapper[4970]: I1128 13:22:52.147042 4970 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/0a609cfd-28dc-440a-8433-6933565864a7-var-lock\") on node \"crc\" DevicePath \"\"" Nov 28 13:22:52 crc kubenswrapper[4970]: I1128 13:22:52.147073 4970 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/0a609cfd-28dc-440a-8433-6933565864a7-kubelet-dir\") on node \"crc\" DevicePath \"\"" Nov 28 13:22:52 crc kubenswrapper[4970]: I1128 13:22:52.151196 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0a609cfd-28dc-440a-8433-6933565864a7-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "0a609cfd-28dc-440a-8433-6933565864a7" (UID: "0a609cfd-28dc-440a-8433-6933565864a7"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 28 13:22:52 crc kubenswrapper[4970]: I1128 13:22:52.249964 4970 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0a609cfd-28dc-440a-8433-6933565864a7-kube-api-access\") on node \"crc\" DevicePath \"\"" Nov 28 13:22:52 crc kubenswrapper[4970]: I1128 13:22:52.717898 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"0a609cfd-28dc-440a-8433-6933565864a7","Type":"ContainerDied","Data":"5ef8efbb1e2a9cc1b675f6ce8927e999b4b350693783ccc54b22fb3bbefeccf0"} Nov 28 13:22:52 crc kubenswrapper[4970]: I1128 13:22:52.718184 4970 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5ef8efbb1e2a9cc1b675f6ce8927e999b4b350693783ccc54b22fb3bbefeccf0" Nov 28 13:22:52 crc kubenswrapper[4970]: I1128 13:22:52.717913 4970 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Nov 28 13:22:52 crc kubenswrapper[4970]: I1128 13:22:52.721057 4970 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Nov 28 13:22:52 crc kubenswrapper[4970]: I1128 13:22:52.721773 4970 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="db18f7f1730110037ceac0335bfd4c4176cd8e6b4f3aeb9f7caacaeb82c0a9e9" exitCode=0 Nov 28 13:22:52 crc kubenswrapper[4970]: I1128 13:22:52.730162 4970 status_manager.go:851] "Failed to get status for pod" podUID="0a609cfd-28dc-440a-8433-6933565864a7" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.212:6443: connect: connection refused" Nov 28 13:22:55 crc kubenswrapper[4970]: I1128 13:22:55.076204 4970 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-jcv9n" Nov 28 13:22:55 crc kubenswrapper[4970]: I1128 13:22:55.076767 4970 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-jcv9n" Nov 28 13:22:55 crc kubenswrapper[4970]: I1128 13:22:55.138704 4970 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-jcv9n" Nov 28 13:22:55 crc kubenswrapper[4970]: I1128 13:22:55.139379 4970 status_manager.go:851] "Failed to get status for pod" podUID="0a609cfd-28dc-440a-8433-6933565864a7" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.212:6443: connect: connection refused" Nov 28 13:22:55 crc kubenswrapper[4970]: I1128 13:22:55.139928 4970 status_manager.go:851] "Failed to get status for pod" podUID="619af67d-331c-4b38-b536-269ba823fd75" pod="openshift-marketplace/certified-operators-jcv9n" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-jcv9n\": dial tcp 38.102.83.212:6443: connect: connection refused" Nov 28 13:22:55 crc kubenswrapper[4970]: E1128 13:22:55.161980 4970 kubelet.go:1929] "Failed creating a mirror pod for" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods\": dial tcp 38.102.83.212:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Nov 28 13:22:55 crc kubenswrapper[4970]: I1128 13:22:55.162465 4970 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Nov 28 13:22:55 crc kubenswrapper[4970]: W1128 13:22:55.196615 4970 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf85e55b1a89d02b0cb034b1ea31ed45a.slice/crio-5725cbcf5ff8343106b53652b35633984ba155c408fc1d953d7f6a476ee6b5f9 WatchSource:0}: Error finding container 5725cbcf5ff8343106b53652b35633984ba155c408fc1d953d7f6a476ee6b5f9: Status 404 returned error can't find the container with id 5725cbcf5ff8343106b53652b35633984ba155c408fc1d953d7f6a476ee6b5f9 Nov 28 13:22:55 crc kubenswrapper[4970]: I1128 13:22:55.348371 4970 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Nov 28 13:22:55 crc kubenswrapper[4970]: I1128 13:22:55.349697 4970 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 28 13:22:55 crc kubenswrapper[4970]: I1128 13:22:55.350231 4970 status_manager.go:851] "Failed to get status for pod" podUID="619af67d-331c-4b38-b536-269ba823fd75" pod="openshift-marketplace/certified-operators-jcv9n" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-jcv9n\": dial tcp 38.102.83.212:6443: connect: connection refused" Nov 28 13:22:55 crc kubenswrapper[4970]: I1128 13:22:55.350590 4970 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.212:6443: connect: connection refused" Nov 28 13:22:55 crc kubenswrapper[4970]: I1128 13:22:55.350899 4970 status_manager.go:851] "Failed to get status for pod" podUID="0a609cfd-28dc-440a-8433-6933565864a7" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.212:6443: connect: connection refused" Nov 28 13:22:55 crc kubenswrapper[4970]: I1128 13:22:55.457959 4970 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-5jvzw" Nov 28 13:22:55 crc kubenswrapper[4970]: I1128 13:22:55.458039 4970 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-5jvzw" Nov 28 13:22:55 crc kubenswrapper[4970]: I1128 13:22:55.492113 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Nov 28 13:22:55 crc kubenswrapper[4970]: I1128 13:22:55.492237 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Nov 28 13:22:55 crc kubenswrapper[4970]: I1128 13:22:55.492249 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir" (OuterVolumeSpecName: "resource-dir") pod 
"f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 28 13:22:55 crc kubenswrapper[4970]: I1128 13:22:55.492320 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Nov 28 13:22:55 crc kubenswrapper[4970]: I1128 13:22:55.492384 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 28 13:22:55 crc kubenswrapper[4970]: I1128 13:22:55.492525 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir" (OuterVolumeSpecName: "cert-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "cert-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 28 13:22:55 crc kubenswrapper[4970]: I1128 13:22:55.492598 4970 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") on node \"crc\" DevicePath \"\"" Nov 28 13:22:55 crc kubenswrapper[4970]: I1128 13:22:55.492614 4970 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") on node \"crc\" DevicePath \"\"" Nov 28 13:22:55 crc kubenswrapper[4970]: I1128 13:22:55.494576 4970 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-5jvzw" Nov 28 13:22:55 crc kubenswrapper[4970]: I1128 13:22:55.495021 4970 status_manager.go:851] "Failed to get status for pod" podUID="0a609cfd-28dc-440a-8433-6933565864a7" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.212:6443: connect: connection refused" Nov 28 13:22:55 crc kubenswrapper[4970]: I1128 13:22:55.495371 4970 status_manager.go:851] "Failed to get status for pod" podUID="619af67d-331c-4b38-b536-269ba823fd75" pod="openshift-marketplace/certified-operators-jcv9n" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-jcv9n\": dial tcp 38.102.83.212:6443: connect: connection refused" Nov 28 13:22:55 crc kubenswrapper[4970]: I1128 13:22:55.495595 4970 status_manager.go:851] "Failed to get status for pod" podUID="ea4c7183-d326-44cd-8e40-649e3dad901e" pod="openshift-marketplace/certified-operators-5jvzw" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-5jvzw\": dial tcp 38.102.83.212:6443: connect: connection refused" Nov 28 13:22:55 crc kubenswrapper[4970]: I1128 13:22:55.593858 4970 reconciler_common.go:293] "Volume detached for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") on node \"crc\" DevicePath \"\"" Nov 28 13:22:55 crc kubenswrapper[4970]: I1128 13:22:55.746585 4970 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Nov 28 13:22:55 crc kubenswrapper[4970]: I1128 13:22:55.747660 4970 scope.go:117] "RemoveContainer" containerID="478a53f5d28b31054e796d95569a25b33787f3d2562ca562603b609f530e95d0" Nov 28 13:22:55 crc kubenswrapper[4970]: I1128 13:22:55.747671 4970 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 28 13:22:55 crc kubenswrapper[4970]: I1128 13:22:55.748329 4970 status_manager.go:851] "Failed to get status for pod" podUID="619af67d-331c-4b38-b536-269ba823fd75" pod="openshift-marketplace/certified-operators-jcv9n" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-jcv9n\": dial tcp 38.102.83.212:6443: connect: connection refused" Nov 28 13:22:55 crc kubenswrapper[4970]: I1128 13:22:55.748564 4970 status_manager.go:851] "Failed to get status for pod" podUID="ea4c7183-d326-44cd-8e40-649e3dad901e" pod="openshift-marketplace/certified-operators-5jvzw" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-5jvzw\": dial tcp 38.102.83.212:6443: connect: connection refused" Nov 28 13:22:55 crc kubenswrapper[4970]: I1128 13:22:55.748919 4970 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.212:6443: connect: connection refused" Nov 28 13:22:55 crc kubenswrapper[4970]: I1128 13:22:55.749304 4970 status_manager.go:851] "Failed to get status for pod" podUID="0a609cfd-28dc-440a-8433-6933565864a7" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.212:6443: connect: connection refused" Nov 28 13:22:55 crc kubenswrapper[4970]: I1128 13:22:55.750503 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"3bde70e430f719ea4d499e311810983b13de2afe93179bd480075516e18a7c96"} Nov 28 13:22:55 crc kubenswrapper[4970]: I1128 13:22:55.750574 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"5725cbcf5ff8343106b53652b35633984ba155c408fc1d953d7f6a476ee6b5f9"} Nov 28 13:22:55 crc kubenswrapper[4970]: E1128 13:22:55.751433 4970 kubelet.go:1929] "Failed creating a mirror pod for" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods\": dial tcp 38.102.83.212:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Nov 28 13:22:55 crc kubenswrapper[4970]: I1128 13:22:55.751662 4970 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.212:6443: connect: connection refused" Nov 28 13:22:55 crc kubenswrapper[4970]: I1128 13:22:55.752144 4970 
status_manager.go:851] "Failed to get status for pod" podUID="0a609cfd-28dc-440a-8433-6933565864a7" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.212:6443: connect: connection refused" Nov 28 13:22:55 crc kubenswrapper[4970]: I1128 13:22:55.752539 4970 status_manager.go:851] "Failed to get status for pod" podUID="619af67d-331c-4b38-b536-269ba823fd75" pod="openshift-marketplace/certified-operators-jcv9n" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-jcv9n\": dial tcp 38.102.83.212:6443: connect: connection refused" Nov 28 13:22:55 crc kubenswrapper[4970]: I1128 13:22:55.752998 4970 status_manager.go:851] "Failed to get status for pod" podUID="ea4c7183-d326-44cd-8e40-649e3dad901e" pod="openshift-marketplace/certified-operators-5jvzw" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-5jvzw\": dial tcp 38.102.83.212:6443: connect: connection refused" Nov 28 13:22:55 crc kubenswrapper[4970]: I1128 13:22:55.760019 4970 status_manager.go:851] "Failed to get status for pod" podUID="0a609cfd-28dc-440a-8433-6933565864a7" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.212:6443: connect: connection refused" Nov 28 13:22:55 crc kubenswrapper[4970]: I1128 13:22:55.760237 4970 status_manager.go:851] "Failed to get status for pod" podUID="619af67d-331c-4b38-b536-269ba823fd75" pod="openshift-marketplace/certified-operators-jcv9n" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-jcv9n\": dial tcp 38.102.83.212:6443: connect: connection refused" Nov 28 13:22:55 crc kubenswrapper[4970]: I1128 13:22:55.760428 4970 status_manager.go:851] "Failed to get status for pod" podUID="ea4c7183-d326-44cd-8e40-649e3dad901e" pod="openshift-marketplace/certified-operators-5jvzw" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-5jvzw\": dial tcp 38.102.83.212:6443: connect: connection refused" Nov 28 13:22:55 crc kubenswrapper[4970]: I1128 13:22:55.760614 4970 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.212:6443: connect: connection refused" Nov 28 13:22:55 crc kubenswrapper[4970]: I1128 13:22:55.770155 4970 scope.go:117] "RemoveContainer" containerID="6d5b3592754b29f4aebbcfcdd7d4f38b10e70438680d2f9d86409bd2af4c1b84" Nov 28 13:22:55 crc kubenswrapper[4970]: I1128 13:22:55.790444 4970 scope.go:117] "RemoveContainer" containerID="1886c741d951051096d9345e87c6c1386f48b9576ae2d6ed095a79be42c5297c" Nov 28 13:22:55 crc kubenswrapper[4970]: I1128 13:22:55.795329 4970 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-jcv9n" Nov 28 13:22:55 crc kubenswrapper[4970]: I1128 13:22:55.795843 4970 status_manager.go:851] "Failed to get status for pod" podUID="619af67d-331c-4b38-b536-269ba823fd75" pod="openshift-marketplace/certified-operators-jcv9n" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-jcv9n\": dial tcp 38.102.83.212:6443: connect: connection refused" Nov 28 13:22:55 crc kubenswrapper[4970]: I1128 13:22:55.796378 4970 status_manager.go:851] "Failed to get status for pod" podUID="ea4c7183-d326-44cd-8e40-649e3dad901e" pod="openshift-marketplace/certified-operators-5jvzw" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-5jvzw\": dial tcp 38.102.83.212:6443: connect: connection refused" Nov 28 13:22:55 crc kubenswrapper[4970]: I1128 13:22:55.796714 4970 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.212:6443: connect: connection refused" Nov 28 13:22:55 crc kubenswrapper[4970]: I1128 13:22:55.797040 4970 status_manager.go:851] "Failed to get status for pod" podUID="0a609cfd-28dc-440a-8433-6933565864a7" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.212:6443: connect: connection refused" Nov 28 13:22:55 crc kubenswrapper[4970]: I1128 13:22:55.811087 4970 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-5jvzw" Nov 28 13:22:55 crc kubenswrapper[4970]: I1128 13:22:55.811671 4970 status_manager.go:851] "Failed to get status for pod" podUID="619af67d-331c-4b38-b536-269ba823fd75" pod="openshift-marketplace/certified-operators-jcv9n" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-jcv9n\": dial tcp 38.102.83.212:6443: connect: connection refused" Nov 28 13:22:55 crc kubenswrapper[4970]: I1128 13:22:55.811999 4970 status_manager.go:851] "Failed to get status for pod" podUID="ea4c7183-d326-44cd-8e40-649e3dad901e" pod="openshift-marketplace/certified-operators-5jvzw" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-5jvzw\": dial tcp 38.102.83.212:6443: connect: connection refused" Nov 28 13:22:55 crc kubenswrapper[4970]: I1128 13:22:55.812404 4970 scope.go:117] "RemoveContainer" containerID="7c49f9caf580dac97faa02766cf5d85c9483f0b201e1987c285ad9aa45178374" Nov 28 13:22:55 crc kubenswrapper[4970]: I1128 13:22:55.812478 4970 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.212:6443: connect: connection refused" Nov 28 13:22:55 crc kubenswrapper[4970]: I1128 13:22:55.812733 4970 status_manager.go:851] "Failed to get status for pod" podUID="0a609cfd-28dc-440a-8433-6933565864a7" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.212:6443: connect: connection refused" Nov 28 13:22:55 crc kubenswrapper[4970]: I1128 13:22:55.831787 4970 scope.go:117] "RemoveContainer" containerID="db18f7f1730110037ceac0335bfd4c4176cd8e6b4f3aeb9f7caacaeb82c0a9e9" Nov 28 13:22:55 crc kubenswrapper[4970]: I1128 13:22:55.856290 4970 scope.go:117] "RemoveContainer" 
containerID="9535aede1ed0a237e5190688ad4f4177d2a18a1bde3c6fb051dd71fdbc61ed52" Nov 28 13:22:57 crc kubenswrapper[4970]: E1128 13:22:57.231572 4970 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.212:6443: connect: connection refused" Nov 28 13:22:57 crc kubenswrapper[4970]: E1128 13:22:57.232683 4970 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.212:6443: connect: connection refused" Nov 28 13:22:57 crc kubenswrapper[4970]: E1128 13:22:57.233541 4970 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.212:6443: connect: connection refused" Nov 28 13:22:57 crc kubenswrapper[4970]: E1128 13:22:57.234131 4970 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.212:6443: connect: connection refused" Nov 28 13:22:57 crc kubenswrapper[4970]: E1128 13:22:57.234653 4970 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.212:6443: connect: connection refused" Nov 28 13:22:57 crc kubenswrapper[4970]: I1128 13:22:57.234719 4970 controller.go:115] "failed to update lease using latest lease, fallback to ensure lease" err="failed 5 attempts to update lease" Nov 28 13:22:57 crc kubenswrapper[4970]: E1128 13:22:57.235113 4970 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.212:6443: connect: connection refused" interval="200ms" Nov 28 13:22:57 crc kubenswrapper[4970]: I1128 13:22:57.398485 4970 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f4b27818a5e8e43d0dc095d08835c792" path="/var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/volumes" Nov 28 13:22:57 crc kubenswrapper[4970]: E1128 13:22:57.436948 4970 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.212:6443: connect: connection refused" interval="400ms" Nov 28 13:22:57 crc kubenswrapper[4970]: E1128 13:22:57.837960 4970 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.212:6443: connect: connection refused" interval="800ms" Nov 28 13:22:58 crc kubenswrapper[4970]: I1128 13:22:58.232542 4970 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-nzvwp" Nov 28 13:22:58 crc kubenswrapper[4970]: I1128 13:22:58.232691 4970 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-nzvwp" Nov 28 13:22:58 crc kubenswrapper[4970]: I1128 13:22:58.286875 4970 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-nzvwp" 
Nov 28 13:22:58 crc kubenswrapper[4970]: I1128 13:22:58.287732 4970 status_manager.go:851] "Failed to get status for pod" podUID="0a609cfd-28dc-440a-8433-6933565864a7" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.212:6443: connect: connection refused" Nov 28 13:22:58 crc kubenswrapper[4970]: I1128 13:22:58.288274 4970 status_manager.go:851] "Failed to get status for pod" podUID="4cae710a-4284-4d76-b507-d7aa55adba72" pod="openshift-marketplace/redhat-marketplace-nzvwp" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-nzvwp\": dial tcp 38.102.83.212:6443: connect: connection refused" Nov 28 13:22:58 crc kubenswrapper[4970]: I1128 13:22:58.288969 4970 status_manager.go:851] "Failed to get status for pod" podUID="619af67d-331c-4b38-b536-269ba823fd75" pod="openshift-marketplace/certified-operators-jcv9n" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-jcv9n\": dial tcp 38.102.83.212:6443: connect: connection refused" Nov 28 13:22:58 crc kubenswrapper[4970]: I1128 13:22:58.289409 4970 status_manager.go:851] "Failed to get status for pod" podUID="ea4c7183-d326-44cd-8e40-649e3dad901e" pod="openshift-marketplace/certified-operators-5jvzw" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-5jvzw\": dial tcp 38.102.83.212:6443: connect: connection refused" Nov 28 13:22:58 crc kubenswrapper[4970]: E1128 13:22:58.639619 4970 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.212:6443: connect: connection refused" interval="1.6s" Nov 28 13:22:58 crc kubenswrapper[4970]: I1128 13:22:58.842290 4970 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-wwr74" Nov 28 13:22:58 crc kubenswrapper[4970]: I1128 13:22:58.842487 4970 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-wwr74" Nov 28 13:22:58 crc kubenswrapper[4970]: I1128 13:22:58.843721 4970 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-nzvwp" Nov 28 13:22:58 crc kubenswrapper[4970]: I1128 13:22:58.844446 4970 status_manager.go:851] "Failed to get status for pod" podUID="4cae710a-4284-4d76-b507-d7aa55adba72" pod="openshift-marketplace/redhat-marketplace-nzvwp" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-nzvwp\": dial tcp 38.102.83.212:6443: connect: connection refused" Nov 28 13:22:58 crc kubenswrapper[4970]: I1128 13:22:58.844916 4970 status_manager.go:851] "Failed to get status for pod" podUID="619af67d-331c-4b38-b536-269ba823fd75" pod="openshift-marketplace/certified-operators-jcv9n" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-jcv9n\": dial tcp 38.102.83.212:6443: connect: connection refused" Nov 28 13:22:58 crc kubenswrapper[4970]: I1128 13:22:58.845517 4970 status_manager.go:851] "Failed to get status for pod" podUID="ea4c7183-d326-44cd-8e40-649e3dad901e" pod="openshift-marketplace/certified-operators-5jvzw" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-5jvzw\": dial tcp 38.102.83.212:6443: connect: connection refused" Nov 28 13:22:58 crc kubenswrapper[4970]: I1128 13:22:58.846199 4970 status_manager.go:851] "Failed to get status for pod" podUID="0a609cfd-28dc-440a-8433-6933565864a7" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.212:6443: connect: connection refused" Nov 28 13:22:58 crc kubenswrapper[4970]: I1128 13:22:58.885189 4970 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-wwr74" Nov 28 13:22:58 crc kubenswrapper[4970]: I1128 13:22:58.885826 4970 status_manager.go:851] "Failed to get status for pod" podUID="4cae710a-4284-4d76-b507-d7aa55adba72" pod="openshift-marketplace/redhat-marketplace-nzvwp" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-nzvwp\": dial tcp 38.102.83.212:6443: connect: connection refused" Nov 28 13:22:58 crc kubenswrapper[4970]: I1128 13:22:58.886192 4970 status_manager.go:851] "Failed to get status for pod" podUID="619af67d-331c-4b38-b536-269ba823fd75" pod="openshift-marketplace/certified-operators-jcv9n" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-jcv9n\": dial tcp 38.102.83.212:6443: connect: connection refused" Nov 28 13:22:58 crc kubenswrapper[4970]: I1128 13:22:58.886632 4970 status_manager.go:851] "Failed to get status for pod" podUID="ea4c7183-d326-44cd-8e40-649e3dad901e" pod="openshift-marketplace/certified-operators-5jvzw" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-5jvzw\": dial tcp 38.102.83.212:6443: connect: connection refused" Nov 28 13:22:58 crc kubenswrapper[4970]: I1128 13:22:58.886855 4970 status_manager.go:851] "Failed to get status for pod" podUID="a92f4929-dcab-4362-a5f2-c648f274bf04" pod="openshift-marketplace/redhat-operators-wwr74" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-wwr74\": dial tcp 38.102.83.212:6443: connect: connection refused" Nov 28 13:22:58 crc kubenswrapper[4970]: I1128 13:22:58.887141 4970 status_manager.go:851] "Failed to get status for pod" podUID="0a609cfd-28dc-440a-8433-6933565864a7" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.212:6443: connect: connection refused" Nov 28 13:22:59 crc kubenswrapper[4970]: I1128 13:22:59.289620 4970 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-fp5dk" Nov 28 13:22:59 crc kubenswrapper[4970]: I1128 13:22:59.290202 4970 status_manager.go:851] "Failed to get status for pod" podUID="a92f4929-dcab-4362-a5f2-c648f274bf04" pod="openshift-marketplace/redhat-operators-wwr74" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-wwr74\": dial tcp 38.102.83.212:6443: connect: connection refused" Nov 28 13:22:59 crc kubenswrapper[4970]: I1128 13:22:59.290764 4970 status_manager.go:851] "Failed to get status for pod" podUID="0a609cfd-28dc-440a-8433-6933565864a7" pod="openshift-kube-apiserver/installer-9-crc" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.212:6443: connect: connection refused" Nov 28 13:22:59 crc kubenswrapper[4970]: I1128 13:22:59.291191 4970 status_manager.go:851] "Failed to get status for pod" podUID="4cae710a-4284-4d76-b507-d7aa55adba72" pod="openshift-marketplace/redhat-marketplace-nzvwp" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-nzvwp\": dial tcp 38.102.83.212:6443: connect: connection refused" Nov 28 13:22:59 crc kubenswrapper[4970]: I1128 13:22:59.291600 4970 status_manager.go:851] "Failed to get status for pod" podUID="619af67d-331c-4b38-b536-269ba823fd75" pod="openshift-marketplace/certified-operators-jcv9n" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-jcv9n\": dial tcp 38.102.83.212:6443: connect: connection refused" Nov 28 13:22:59 crc kubenswrapper[4970]: I1128 13:22:59.292027 4970 status_manager.go:851] "Failed to get status for pod" podUID="7d9651e6-3bf3-4aad-b2ef-8fb6c42eea44" pod="openshift-marketplace/redhat-operators-fp5dk" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-fp5dk\": dial tcp 38.102.83.212:6443: connect: connection refused" Nov 28 13:22:59 crc kubenswrapper[4970]: I1128 13:22:59.292491 4970 status_manager.go:851] "Failed to get status for pod" podUID="ea4c7183-d326-44cd-8e40-649e3dad901e" pod="openshift-marketplace/certified-operators-5jvzw" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-5jvzw\": dial tcp 38.102.83.212:6443: connect: connection refused" Nov 28 13:22:59 crc kubenswrapper[4970]: I1128 13:22:59.351986 4970 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-fp5dk" Nov 28 13:22:59 crc kubenswrapper[4970]: I1128 13:22:59.352602 4970 status_manager.go:851] "Failed to get status for pod" podUID="4cae710a-4284-4d76-b507-d7aa55adba72" pod="openshift-marketplace/redhat-marketplace-nzvwp" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-nzvwp\": dial tcp 38.102.83.212:6443: connect: connection refused" Nov 28 13:22:59 crc kubenswrapper[4970]: I1128 13:22:59.353110 4970 status_manager.go:851] "Failed to get status for pod" podUID="619af67d-331c-4b38-b536-269ba823fd75" pod="openshift-marketplace/certified-operators-jcv9n" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-jcv9n\": dial tcp 38.102.83.212:6443: connect: connection refused" Nov 28 13:22:59 crc kubenswrapper[4970]: I1128 13:22:59.353672 4970 status_manager.go:851] "Failed to get status for pod" podUID="7d9651e6-3bf3-4aad-b2ef-8fb6c42eea44" pod="openshift-marketplace/redhat-operators-fp5dk" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-fp5dk\": dial tcp 38.102.83.212:6443: connect: connection refused" Nov 28 13:22:59 crc kubenswrapper[4970]: I1128 13:22:59.354046 4970 status_manager.go:851] "Failed to get status for pod" podUID="ea4c7183-d326-44cd-8e40-649e3dad901e" pod="openshift-marketplace/certified-operators-5jvzw" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-5jvzw\": dial tcp 38.102.83.212:6443: connect: connection refused" Nov 28 13:22:59 crc 
kubenswrapper[4970]: I1128 13:22:59.354447 4970 status_manager.go:851] "Failed to get status for pod" podUID="a92f4929-dcab-4362-a5f2-c648f274bf04" pod="openshift-marketplace/redhat-operators-wwr74" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-wwr74\": dial tcp 38.102.83.212:6443: connect: connection refused" Nov 28 13:22:59 crc kubenswrapper[4970]: I1128 13:22:59.354812 4970 status_manager.go:851] "Failed to get status for pod" podUID="0a609cfd-28dc-440a-8433-6933565864a7" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.212:6443: connect: connection refused" Nov 28 13:22:59 crc kubenswrapper[4970]: I1128 13:22:59.387168 4970 status_manager.go:851] "Failed to get status for pod" podUID="4cae710a-4284-4d76-b507-d7aa55adba72" pod="openshift-marketplace/redhat-marketplace-nzvwp" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-nzvwp\": dial tcp 38.102.83.212:6443: connect: connection refused" Nov 28 13:22:59 crc kubenswrapper[4970]: I1128 13:22:59.387719 4970 status_manager.go:851] "Failed to get status for pod" podUID="619af67d-331c-4b38-b536-269ba823fd75" pod="openshift-marketplace/certified-operators-jcv9n" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-jcv9n\": dial tcp 38.102.83.212:6443: connect: connection refused" Nov 28 13:22:59 crc kubenswrapper[4970]: I1128 13:22:59.388804 4970 status_manager.go:851] "Failed to get status for pod" podUID="7d9651e6-3bf3-4aad-b2ef-8fb6c42eea44" pod="openshift-marketplace/redhat-operators-fp5dk" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-fp5dk\": dial tcp 38.102.83.212:6443: connect: connection refused" Nov 28 13:22:59 crc kubenswrapper[4970]: I1128 13:22:59.389197 4970 status_manager.go:851] "Failed to get status for pod" podUID="ea4c7183-d326-44cd-8e40-649e3dad901e" pod="openshift-marketplace/certified-operators-5jvzw" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-5jvzw\": dial tcp 38.102.83.212:6443: connect: connection refused" Nov 28 13:22:59 crc kubenswrapper[4970]: I1128 13:22:59.389650 4970 status_manager.go:851] "Failed to get status for pod" podUID="a92f4929-dcab-4362-a5f2-c648f274bf04" pod="openshift-marketplace/redhat-operators-wwr74" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-wwr74\": dial tcp 38.102.83.212:6443: connect: connection refused" Nov 28 13:22:59 crc kubenswrapper[4970]: I1128 13:22:59.389894 4970 status_manager.go:851] "Failed to get status for pod" podUID="0a609cfd-28dc-440a-8433-6933565864a7" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.212:6443: connect: connection refused" Nov 28 13:22:59 crc kubenswrapper[4970]: E1128 13:22:59.742490 4970 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/events\": dial tcp 38.102.83.212:6443: connect: connection refused" event=< Nov 28 13:22:59 crc kubenswrapper[4970]: &Event{ObjectMeta:{redhat-operators-fp5dk.187c2e662622f60d openshift-marketplace 0 0001-01-01 00:00:00 
+0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-marketplace,Name:redhat-operators-fp5dk,UID:7d9651e6-3bf3-4aad-b2ef-8fb6c42eea44,APIVersion:v1,ResourceVersion:28498,FieldPath:spec.containers{registry-server},},Reason:Unhealthy,Message:Startup probe failed: timeout: failed to connect service ":50051" within 1s
Nov 28 13:22:59 crc kubenswrapper[4970]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2025-11-28 13:22:50.283759117 +0000 UTC m=+181.136640917,LastTimestamp:2025-11-28 13:22:50.283759117 +0000 UTC m=+181.136640917,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}
Nov 28 13:22:59 crc kubenswrapper[4970]: >
Nov 28 13:22:59 crc kubenswrapper[4970]: I1128 13:22:59.826555 4970 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-wwr74"
Nov 28 13:22:59 crc kubenswrapper[4970]: I1128 13:22:59.827568 4970 status_manager.go:851] "Failed to get status for pod" podUID="4cae710a-4284-4d76-b507-d7aa55adba72" pod="openshift-marketplace/redhat-marketplace-nzvwp" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-nzvwp\": dial tcp 38.102.83.212:6443: connect: connection refused"
Nov 28 13:22:59 crc kubenswrapper[4970]: I1128 13:22:59.827927 4970 status_manager.go:851] "Failed to get status for pod" podUID="619af67d-331c-4b38-b536-269ba823fd75" pod="openshift-marketplace/certified-operators-jcv9n" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-jcv9n\": dial tcp 38.102.83.212:6443: connect: connection refused"
Nov 28 13:22:59 crc kubenswrapper[4970]: I1128 13:22:59.828677 4970 status_manager.go:851] "Failed to get status for pod" podUID="7d9651e6-3bf3-4aad-b2ef-8fb6c42eea44" pod="openshift-marketplace/redhat-operators-fp5dk" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-fp5dk\": dial tcp 38.102.83.212:6443: connect: connection refused"
Nov 28 13:22:59 crc kubenswrapper[4970]: I1128 13:22:59.829351 4970 status_manager.go:851] "Failed to get status for pod" podUID="ea4c7183-d326-44cd-8e40-649e3dad901e" pod="openshift-marketplace/certified-operators-5jvzw" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-5jvzw\": dial tcp 38.102.83.212:6443: connect: connection refused"
Nov 28 13:22:59 crc kubenswrapper[4970]: I1128 13:22:59.830006 4970 status_manager.go:851] "Failed to get status for pod" podUID="a92f4929-dcab-4362-a5f2-c648f274bf04" pod="openshift-marketplace/redhat-operators-wwr74" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-wwr74\": dial tcp 38.102.83.212:6443: connect: connection refused"
Nov 28 13:22:59 crc kubenswrapper[4970]: I1128 13:22:59.830300 4970 status_manager.go:851] "Failed to get status for pod" podUID="0a609cfd-28dc-440a-8433-6933565864a7" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.212:6443: connect: connection refused"
Nov 28 13:23:00 crc kubenswrapper[4970]: E1128 13:23:00.240914 4970 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.212:6443: connect: connection refused" interval="3.2s"
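The Warning event the kubelet could not post above records the underlying failure: the catalog pod's registry-server did not accept a connection on ":50051" within the probe's 1s window. The real check runs inside the pod as a gRPC health probe; the following is only a plain TCP reachability sketch of the same condition, with the port and the 1s timeout taken from the event message and 127.0.0.1 assumed as the pod-local address.

package main

// Illustrative sketch: a TCP reachability check that fails the same way as the
// startup probe in the Unhealthy event above. The actual probe in the catalog
// pod is a gRPC health check, not this code.

import (
	"fmt"
	"net"
	"os"
	"time"
)

func main() {
	addr := "127.0.0.1:50051" // port from the event; loopback address is an assumption
	conn, err := net.DialTimeout("tcp", addr, 1*time.Second)
	if err != nil {
		// This branch corresponds to Reason:Unhealthy / "Startup probe failed" above.
		fmt.Printf("timeout: failed to connect service %q within 1s: %v\n", ":50051", err)
		os.Exit(1)
	}
	conn.Close()
	fmt.Println("registry-server is accepting connections; the probe would pass")
}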
\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.212:6443: connect: connection refused" interval="3.2s" Nov 28 13:23:01 crc kubenswrapper[4970]: I1128 13:23:01.381029 4970 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 28 13:23:01 crc kubenswrapper[4970]: I1128 13:23:01.382561 4970 status_manager.go:851] "Failed to get status for pod" podUID="4cae710a-4284-4d76-b507-d7aa55adba72" pod="openshift-marketplace/redhat-marketplace-nzvwp" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-nzvwp\": dial tcp 38.102.83.212:6443: connect: connection refused" Nov 28 13:23:01 crc kubenswrapper[4970]: I1128 13:23:01.382974 4970 status_manager.go:851] "Failed to get status for pod" podUID="619af67d-331c-4b38-b536-269ba823fd75" pod="openshift-marketplace/certified-operators-jcv9n" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-jcv9n\": dial tcp 38.102.83.212:6443: connect: connection refused" Nov 28 13:23:01 crc kubenswrapper[4970]: I1128 13:23:01.383554 4970 status_manager.go:851] "Failed to get status for pod" podUID="7d9651e6-3bf3-4aad-b2ef-8fb6c42eea44" pod="openshift-marketplace/redhat-operators-fp5dk" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-fp5dk\": dial tcp 38.102.83.212:6443: connect: connection refused" Nov 28 13:23:01 crc kubenswrapper[4970]: I1128 13:23:01.384397 4970 status_manager.go:851] "Failed to get status for pod" podUID="ea4c7183-d326-44cd-8e40-649e3dad901e" pod="openshift-marketplace/certified-operators-5jvzw" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-5jvzw\": dial tcp 38.102.83.212:6443: connect: connection refused" Nov 28 13:23:01 crc kubenswrapper[4970]: I1128 13:23:01.384924 4970 status_manager.go:851] "Failed to get status for pod" podUID="a92f4929-dcab-4362-a5f2-c648f274bf04" pod="openshift-marketplace/redhat-operators-wwr74" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-wwr74\": dial tcp 38.102.83.212:6443: connect: connection refused" Nov 28 13:23:01 crc kubenswrapper[4970]: I1128 13:23:01.385330 4970 status_manager.go:851] "Failed to get status for pod" podUID="0a609cfd-28dc-440a-8433-6933565864a7" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.212:6443: connect: connection refused" Nov 28 13:23:01 crc kubenswrapper[4970]: I1128 13:23:01.405553 4970 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="aa8646b0-d3c4-45ac-9de5-c342099d5515" Nov 28 13:23:01 crc kubenswrapper[4970]: I1128 13:23:01.405598 4970 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="aa8646b0-d3c4-45ac-9de5-c342099d5515" Nov 28 13:23:01 crc kubenswrapper[4970]: E1128 13:23:01.405949 4970 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.212:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 28 13:23:01 crc kubenswrapper[4970]: I1128 
13:23:01.406868 4970 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 28 13:23:01 crc kubenswrapper[4970]: W1128 13:23:01.446348 4970 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod71bb4a3aecc4ba5b26c4b7318770ce13.slice/crio-db76dae78a787d3ffa66514a58004651bc0fb39b7e3fdebd99236fb11bd33165 WatchSource:0}: Error finding container db76dae78a787d3ffa66514a58004651bc0fb39b7e3fdebd99236fb11bd33165: Status 404 returned error can't find the container with id db76dae78a787d3ffa66514a58004651bc0fb39b7e3fdebd99236fb11bd33165 Nov 28 13:23:01 crc kubenswrapper[4970]: I1128 13:23:01.795794 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"db76dae78a787d3ffa66514a58004651bc0fb39b7e3fdebd99236fb11bd33165"} Nov 28 13:23:03 crc kubenswrapper[4970]: E1128 13:23:03.442419 4970 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.212:6443: connect: connection refused" interval="6.4s" Nov 28 13:23:04 crc kubenswrapper[4970]: I1128 13:23:04.823528 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"b0915d1e1c24fdaf6a5ccec587917ced52b84c26e5c892491b9d991578bad354"} Nov 28 13:23:05 crc kubenswrapper[4970]: I1128 13:23:05.837847 4970 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Nov 28 13:23:05 crc kubenswrapper[4970]: I1128 13:23:05.837946 4970 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="33cbcaa6257083258112041892758c528dc9c9c165831253ec7892e8c7e2b451" exitCode=1 Nov 28 13:23:05 crc kubenswrapper[4970]: I1128 13:23:05.837992 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"33cbcaa6257083258112041892758c528dc9c9c165831253ec7892e8c7e2b451"} Nov 28 13:23:05 crc kubenswrapper[4970]: I1128 13:23:05.838717 4970 scope.go:117] "RemoveContainer" containerID="33cbcaa6257083258112041892758c528dc9c9c165831253ec7892e8c7e2b451" Nov 28 13:23:05 crc kubenswrapper[4970]: I1128 13:23:05.839203 4970 status_manager.go:851] "Failed to get status for pod" podUID="4cae710a-4284-4d76-b507-d7aa55adba72" pod="openshift-marketplace/redhat-marketplace-nzvwp" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-nzvwp\": dial tcp 38.102.83.212:6443: connect: connection refused" Nov 28 13:23:05 crc kubenswrapper[4970]: I1128 13:23:05.839712 4970 status_manager.go:851] "Failed to get status for pod" podUID="619af67d-331c-4b38-b536-269ba823fd75" pod="openshift-marketplace/certified-operators-jcv9n" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-jcv9n\": dial tcp 38.102.83.212:6443: connect: connection refused" Nov 28 13:23:05 crc kubenswrapper[4970]: I1128 13:23:05.840070 4970 status_manager.go:851] "Failed to get 
status for pod" podUID="7d9651e6-3bf3-4aad-b2ef-8fb6c42eea44" pod="openshift-marketplace/redhat-operators-fp5dk" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-fp5dk\": dial tcp 38.102.83.212:6443: connect: connection refused" Nov 28 13:23:05 crc kubenswrapper[4970]: I1128 13:23:05.840658 4970 status_manager.go:851] "Failed to get status for pod" podUID="ea4c7183-d326-44cd-8e40-649e3dad901e" pod="openshift-marketplace/certified-operators-5jvzw" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-5jvzw\": dial tcp 38.102.83.212:6443: connect: connection refused" Nov 28 13:23:05 crc kubenswrapper[4970]: I1128 13:23:05.841162 4970 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.212:6443: connect: connection refused" Nov 28 13:23:05 crc kubenswrapper[4970]: I1128 13:23:05.841756 4970 status_manager.go:851] "Failed to get status for pod" podUID="a92f4929-dcab-4362-a5f2-c648f274bf04" pod="openshift-marketplace/redhat-operators-wwr74" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-wwr74\": dial tcp 38.102.83.212:6443: connect: connection refused" Nov 28 13:23:05 crc kubenswrapper[4970]: I1128 13:23:05.842412 4970 status_manager.go:851] "Failed to get status for pod" podUID="0a609cfd-28dc-440a-8433-6933565864a7" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.212:6443: connect: connection refused" Nov 28 13:23:06 crc kubenswrapper[4970]: I1128 13:23:06.843735 4970 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="aa8646b0-d3c4-45ac-9de5-c342099d5515" Nov 28 13:23:06 crc kubenswrapper[4970]: I1128 13:23:06.844078 4970 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="aa8646b0-d3c4-45ac-9de5-c342099d5515" Nov 28 13:23:06 crc kubenswrapper[4970]: I1128 13:23:06.844255 4970 status_manager.go:851] "Failed to get status for pod" podUID="0a609cfd-28dc-440a-8433-6933565864a7" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.212:6443: connect: connection refused" Nov 28 13:23:06 crc kubenswrapper[4970]: E1128 13:23:06.844439 4970 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.212:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 28 13:23:06 crc kubenswrapper[4970]: I1128 13:23:06.844741 4970 status_manager.go:851] "Failed to get status for pod" podUID="4cae710a-4284-4d76-b507-d7aa55adba72" pod="openshift-marketplace/redhat-marketplace-nzvwp" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-nzvwp\": dial tcp 38.102.83.212:6443: connect: connection refused" Nov 28 13:23:06 crc kubenswrapper[4970]: I1128 13:23:06.845087 4970 status_manager.go:851] "Failed to get status for pod" 
podUID="619af67d-331c-4b38-b536-269ba823fd75" pod="openshift-marketplace/certified-operators-jcv9n" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-jcv9n\": dial tcp 38.102.83.212:6443: connect: connection refused" Nov 28 13:23:06 crc kubenswrapper[4970]: I1128 13:23:06.845441 4970 status_manager.go:851] "Failed to get status for pod" podUID="7d9651e6-3bf3-4aad-b2ef-8fb6c42eea44" pod="openshift-marketplace/redhat-operators-fp5dk" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-fp5dk\": dial tcp 38.102.83.212:6443: connect: connection refused" Nov 28 13:23:06 crc kubenswrapper[4970]: I1128 13:23:06.845752 4970 status_manager.go:851] "Failed to get status for pod" podUID="ea4c7183-d326-44cd-8e40-649e3dad901e" pod="openshift-marketplace/certified-operators-5jvzw" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-5jvzw\": dial tcp 38.102.83.212:6443: connect: connection refused" Nov 28 13:23:06 crc kubenswrapper[4970]: I1128 13:23:06.845968 4970 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.212:6443: connect: connection refused" Nov 28 13:23:06 crc kubenswrapper[4970]: I1128 13:23:06.846168 4970 status_manager.go:851] "Failed to get status for pod" podUID="a92f4929-dcab-4362-a5f2-c648f274bf04" pod="openshift-marketplace/redhat-operators-wwr74" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-wwr74\": dial tcp 38.102.83.212:6443: connect: connection refused" Nov 28 13:23:07 crc kubenswrapper[4970]: I1128 13:23:07.853458 4970 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Nov 28 13:23:07 crc kubenswrapper[4970]: I1128 13:23:07.853911 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"e77e2c3f5e0191cc6d54f80982a7cb1405e59eec685013525304421b349ebbb7"} Nov 28 13:23:07 crc kubenswrapper[4970]: I1128 13:23:07.854922 4970 status_manager.go:851] "Failed to get status for pod" podUID="4cae710a-4284-4d76-b507-d7aa55adba72" pod="openshift-marketplace/redhat-marketplace-nzvwp" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-nzvwp\": dial tcp 38.102.83.212:6443: connect: connection refused" Nov 28 13:23:07 crc kubenswrapper[4970]: I1128 13:23:07.855171 4970 status_manager.go:851] "Failed to get status for pod" podUID="619af67d-331c-4b38-b536-269ba823fd75" pod="openshift-marketplace/certified-operators-jcv9n" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-jcv9n\": dial tcp 38.102.83.212:6443: connect: connection refused" Nov 28 13:23:07 crc kubenswrapper[4970]: I1128 13:23:07.855416 4970 status_manager.go:851] "Failed to get status for pod" podUID="7d9651e6-3bf3-4aad-b2ef-8fb6c42eea44" pod="openshift-marketplace/redhat-operators-fp5dk" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-fp5dk\": dial tcp 38.102.83.212:6443: connect: connection refused" Nov 28 13:23:07 crc kubenswrapper[4970]: I1128 13:23:07.855586 4970 status_manager.go:851] "Failed to get status for pod" podUID="ea4c7183-d326-44cd-8e40-649e3dad901e" pod="openshift-marketplace/certified-operators-5jvzw" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-5jvzw\": dial tcp 38.102.83.212:6443: connect: connection refused" Nov 28 13:23:07 crc kubenswrapper[4970]: I1128 13:23:07.855745 4970 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.212:6443: connect: connection refused" Nov 28 13:23:07 crc kubenswrapper[4970]: I1128 13:23:07.855921 4970 status_manager.go:851] "Failed to get status for pod" podUID="a92f4929-dcab-4362-a5f2-c648f274bf04" pod="openshift-marketplace/redhat-operators-wwr74" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-wwr74\": dial tcp 38.102.83.212:6443: connect: connection refused" Nov 28 13:23:07 crc kubenswrapper[4970]: I1128 13:23:07.856131 4970 status_manager.go:851] "Failed to get status for pod" podUID="0a609cfd-28dc-440a-8433-6933565864a7" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.212:6443: connect: connection refused" Nov 28 13:23:07 crc kubenswrapper[4970]: I1128 13:23:07.856809 4970 generic.go:334] "Generic (PLEG): container finished" podID="71bb4a3aecc4ba5b26c4b7318770ce13" containerID="b0915d1e1c24fdaf6a5ccec587917ced52b84c26e5c892491b9d991578bad354" exitCode=0 Nov 28 13:23:07 crc kubenswrapper[4970]: I1128 13:23:07.856839 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerDied","Data":"b0915d1e1c24fdaf6a5ccec587917ced52b84c26e5c892491b9d991578bad354"} Nov 28 13:23:07 crc kubenswrapper[4970]: I1128 13:23:07.857053 4970 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="aa8646b0-d3c4-45ac-9de5-c342099d5515" Nov 28 13:23:07 crc kubenswrapper[4970]: I1128 13:23:07.857071 4970 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="aa8646b0-d3c4-45ac-9de5-c342099d5515" Nov 28 13:23:07 crc kubenswrapper[4970]: E1128 13:23:07.857382 4970 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.212:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 28 13:23:07 crc kubenswrapper[4970]: I1128 13:23:07.857401 4970 status_manager.go:851] "Failed to get status for pod" podUID="ea4c7183-d326-44cd-8e40-649e3dad901e" pod="openshift-marketplace/certified-operators-5jvzw" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-5jvzw\": dial tcp 38.102.83.212:6443: connect: connection refused" Nov 28 13:23:07 crc kubenswrapper[4970]: I1128 13:23:07.857564 4970 
status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.212:6443: connect: connection refused" Nov 28 13:23:07 crc kubenswrapper[4970]: I1128 13:23:07.857706 4970 status_manager.go:851] "Failed to get status for pod" podUID="a92f4929-dcab-4362-a5f2-c648f274bf04" pod="openshift-marketplace/redhat-operators-wwr74" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-wwr74\": dial tcp 38.102.83.212:6443: connect: connection refused" Nov 28 13:23:07 crc kubenswrapper[4970]: I1128 13:23:07.857843 4970 status_manager.go:851] "Failed to get status for pod" podUID="0a609cfd-28dc-440a-8433-6933565864a7" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.212:6443: connect: connection refused" Nov 28 13:23:07 crc kubenswrapper[4970]: I1128 13:23:07.858033 4970 status_manager.go:851] "Failed to get status for pod" podUID="4cae710a-4284-4d76-b507-d7aa55adba72" pod="openshift-marketplace/redhat-marketplace-nzvwp" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-nzvwp\": dial tcp 38.102.83.212:6443: connect: connection refused" Nov 28 13:23:07 crc kubenswrapper[4970]: I1128 13:23:07.858279 4970 status_manager.go:851] "Failed to get status for pod" podUID="619af67d-331c-4b38-b536-269ba823fd75" pod="openshift-marketplace/certified-operators-jcv9n" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-jcv9n\": dial tcp 38.102.83.212:6443: connect: connection refused" Nov 28 13:23:07 crc kubenswrapper[4970]: I1128 13:23:07.858549 4970 status_manager.go:851] "Failed to get status for pod" podUID="7d9651e6-3bf3-4aad-b2ef-8fb6c42eea44" pod="openshift-marketplace/redhat-operators-fp5dk" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-fp5dk\": dial tcp 38.102.83.212:6443: connect: connection refused" Nov 28 13:23:08 crc kubenswrapper[4970]: I1128 13:23:08.873023 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"4d924d7add0d7743ce811d880729ae39b4f0da3cc9dae6409625cb8a74fa3a8c"} Nov 28 13:23:08 crc kubenswrapper[4970]: I1128 13:23:08.873506 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"9c8c540c4ff9df401abf55ec48b5b424099e2e37bdbd9e8fa4c14466f00a6032"} Nov 28 13:23:08 crc kubenswrapper[4970]: I1128 13:23:08.873517 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"907171856b6d420907d92b95b46f0727a287115ca8b9c3402c85a3f6175b3cee"} Nov 28 13:23:09 crc kubenswrapper[4970]: I1128 13:23:09.881513 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" 
event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"3166dc607114356cc0ba813342fb117dffd3c02690d355321e5b9877fba157c3"} Nov 28 13:23:09 crc kubenswrapper[4970]: I1128 13:23:09.881553 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"7cbcff15e6f1073d1ecb0b513b21f1d52f7b97b1fed3536b1b13aaf410ab9bfa"} Nov 28 13:23:09 crc kubenswrapper[4970]: I1128 13:23:09.881677 4970 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 28 13:23:09 crc kubenswrapper[4970]: I1128 13:23:09.881794 4970 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="aa8646b0-d3c4-45ac-9de5-c342099d5515" Nov 28 13:23:09 crc kubenswrapper[4970]: I1128 13:23:09.881813 4970 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="aa8646b0-d3c4-45ac-9de5-c342099d5515" Nov 28 13:23:11 crc kubenswrapper[4970]: I1128 13:23:11.407325 4970 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 28 13:23:11 crc kubenswrapper[4970]: I1128 13:23:11.407380 4970 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 28 13:23:11 crc kubenswrapper[4970]: I1128 13:23:11.416917 4970 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 28 13:23:12 crc kubenswrapper[4970]: I1128 13:23:12.662125 4970 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Nov 28 13:23:12 crc kubenswrapper[4970]: I1128 13:23:12.669543 4970 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Nov 28 13:23:12 crc kubenswrapper[4970]: I1128 13:23:12.899430 4970 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Nov 28 13:23:14 crc kubenswrapper[4970]: I1128 13:23:14.887783 4970 kubelet.go:1914] "Deleted mirror pod because it is outdated" pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 28 13:23:14 crc kubenswrapper[4970]: I1128 13:23:14.909928 4970 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="aa8646b0-d3c4-45ac-9de5-c342099d5515" Nov 28 13:23:14 crc kubenswrapper[4970]: I1128 13:23:14.909961 4970 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="aa8646b0-d3c4-45ac-9de5-c342099d5515" Nov 28 13:23:14 crc kubenswrapper[4970]: I1128 13:23:14.913315 4970 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 28 13:23:14 crc kubenswrapper[4970]: I1128 13:23:14.914968 4970 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="5ecd43be-5027-4806-8e2b-cf9c71ebb44c" Nov 28 13:23:15 crc kubenswrapper[4970]: I1128 13:23:15.915599 4970 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="aa8646b0-d3c4-45ac-9de5-c342099d5515" Nov 28 13:23:15 crc kubenswrapper[4970]: I1128 
13:23:15.915951 4970 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="aa8646b0-d3c4-45ac-9de5-c342099d5515"
Nov 28 13:23:19 crc kubenswrapper[4970]: I1128 13:23:19.431000 4970 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="5ecd43be-5027-4806-8e2b-cf9c71ebb44c"
Nov 28 13:23:21 crc kubenswrapper[4970]: I1128 13:23:21.333544 4970 patch_prober.go:28] interesting pod/machine-config-daemon-tjrng container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Nov 28 13:23:21 crc kubenswrapper[4970]: I1128 13:23:21.333647 4970 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-tjrng" podUID="70bedd43-c527-436e-b47b-0b9ec5b10601" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Nov 28 13:23:21 crc kubenswrapper[4970]: I1128 13:23:21.333726 4970 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-tjrng"
Nov 28 13:23:21 crc kubenswrapper[4970]: I1128 13:23:21.334696 4970 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"fc9b6fc184f5dc3ba36a264ad6b3b87d8306222016e8b9eab63d75530062a2bd"} pod="openshift-machine-config-operator/machine-config-daemon-tjrng" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Nov 28 13:23:21 crc kubenswrapper[4970]: I1128 13:23:21.334901 4970 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-tjrng" podUID="70bedd43-c527-436e-b47b-0b9ec5b10601" containerName="machine-config-daemon" containerID="cri-o://fc9b6fc184f5dc3ba36a264ad6b3b87d8306222016e8b9eab63d75530062a2bd" gracePeriod=600
Nov 28 13:23:21 crc kubenswrapper[4970]: I1128 13:23:21.954155 4970 generic.go:334] "Generic (PLEG): container finished" podID="70bedd43-c527-436e-b47b-0b9ec5b10601" containerID="fc9b6fc184f5dc3ba36a264ad6b3b87d8306222016e8b9eab63d75530062a2bd" exitCode=0
Nov 28 13:23:21 crc kubenswrapper[4970]: I1128 13:23:21.954242 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-tjrng" event={"ID":"70bedd43-c527-436e-b47b-0b9ec5b10601","Type":"ContainerDied","Data":"fc9b6fc184f5dc3ba36a264ad6b3b87d8306222016e8b9eab63d75530062a2bd"}
Nov 28 13:23:22 crc kubenswrapper[4970]: I1128 13:23:22.968883 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-tjrng" event={"ID":"70bedd43-c527-436e-b47b-0b9ec5b10601","Type":"ContainerStarted","Data":"37f88618e3e0c64d996d73cc9caf51cfb18f91db50c0f6d5a80a21593f745369"}
Nov 28 13:23:23 crc kubenswrapper[4970]: I1128 13:23:23.967405 4970 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Nov 28 13:23:24 crc kubenswrapper[4970]: I1128 13:23:24.660516 4970 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client"
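A few entries above, the kubelet's liveness probe for machine-config-daemon failed with connection refused against http://127.0.0.1:8798/health, so the container was killed (gracePeriod=600) and restarted. Below is a minimal Go sketch of an HTTP check of that shape; it is not the kubelet's prober, the URL comes from the log, and the 1s timeout and the "2xx/3xx means healthy" rule are assumptions matching typical HTTP-probe behaviour.

package main

// Illustrative sketch: an HTTP liveness-style check against the endpoint
// reported in the failed machine-config-daemon probe above. Not kubelet code.

import (
	"fmt"
	"net/http"
	"os"
	"time"
)

func main() {
	client := &http.Client{Timeout: 1 * time.Second} // short timeout, as liveness probes use
	resp, err := client.Get("http://127.0.0.1:8798/health")
	if err != nil {
		// e.g. "dial tcp 127.0.0.1:8798: connect: connection refused", as in the log
		fmt.Printf("liveness check failed: %v\n", err)
		os.Exit(1)
	}
	defer resp.Body.Close()
	if resp.StatusCode < 200 || resp.StatusCode >= 400 {
		fmt.Printf("liveness check failed: HTTP %d\n", resp.StatusCode)
		os.Exit(1)
	}
	fmt.Println("healthy")
}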
Nov 28 13:23:25 crc kubenswrapper[4970]: I1128 13:23:25.184192 4970 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides"
Nov 28 13:23:25 crc kubenswrapper[4970]: I1128 13:23:25.367557 4970 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt"
Nov 28 13:23:25 crc kubenswrapper[4970]: I1128 13:23:25.635791 4970 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca"
Nov 28 13:23:26 crc kubenswrapper[4970]: I1128 13:23:26.153914 4970 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key"
Nov 28 13:23:26 crc kubenswrapper[4970]: I1128 13:23:26.273364 4970 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt"
Nov 28 13:23:26 crc kubenswrapper[4970]: I1128 13:23:26.315753 4970 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config"
Nov 28 13:23:26 crc kubenswrapper[4970]: I1128 13:23:26.582485 4970 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert"
Nov 28 13:23:26 crc kubenswrapper[4970]: I1128 13:23:26.625825 4970 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt"
Nov 28 13:23:26 crc kubenswrapper[4970]: I1128 13:23:26.644523 4970 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7"
Nov 28 13:23:26 crc kubenswrapper[4970]: I1128 13:23:26.711882 4970 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle"
Nov 28 13:23:26 crc kubenswrapper[4970]: I1128 13:23:26.850344 4970 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt"
Nov 28 13:23:27 crc kubenswrapper[4970]: I1128 13:23:27.055814 4970 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt"
Nov 28 13:23:27 crc kubenswrapper[4970]: I1128 13:23:27.238563 4970 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt"
Nov 28 13:23:27 crc kubenswrapper[4970]: I1128 13:23:27.494291 4970 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z"
Nov 28 13:23:27 crc kubenswrapper[4970]: I1128 13:23:27.617598 4970 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert"
Nov 28 13:23:27 crc kubenswrapper[4970]: I1128 13:23:27.663052 4970 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt"
Nov 28 13:23:27 crc kubenswrapper[4970]: I1128 13:23:27.742541 4970 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1"
Nov 28 13:23:27 crc kubenswrapper[4970]: I1128 13:23:27.789681 4970 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert"
Nov 28 13:23:27 crc kubenswrapper[4970]: I1128 13:23:27.791259 4970 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx"
Nov 28 13:23:27 crc kubenswrapper[4970]: I1128 13:23:27.857514 4970 reflector.go:368] Caches populated for *v1.Secret 
from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Nov 28 13:23:27 crc kubenswrapper[4970]: I1128 13:23:27.859389 4970 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Nov 28 13:23:27 crc kubenswrapper[4970]: I1128 13:23:27.972102 4970 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Nov 28 13:23:28 crc kubenswrapper[4970]: I1128 13:23:28.062906 4970 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Nov 28 13:23:28 crc kubenswrapper[4970]: I1128 13:23:28.164550 4970 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Nov 28 13:23:28 crc kubenswrapper[4970]: I1128 13:23:28.200564 4970 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Nov 28 13:23:28 crc kubenswrapper[4970]: I1128 13:23:28.211761 4970 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Nov 28 13:23:28 crc kubenswrapper[4970]: I1128 13:23:28.234731 4970 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Nov 28 13:23:28 crc kubenswrapper[4970]: I1128 13:23:28.248968 4970 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Nov 28 13:23:28 crc kubenswrapper[4970]: I1128 13:23:28.316625 4970 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Nov 28 13:23:28 crc kubenswrapper[4970]: I1128 13:23:28.326409 4970 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Nov 28 13:23:28 crc kubenswrapper[4970]: I1128 13:23:28.400545 4970 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Nov 28 13:23:28 crc kubenswrapper[4970]: I1128 13:23:28.404356 4970 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Nov 28 13:23:28 crc kubenswrapper[4970]: I1128 13:23:28.420101 4970 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Nov 28 13:23:28 crc kubenswrapper[4970]: I1128 13:23:28.551350 4970 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Nov 28 13:23:28 crc kubenswrapper[4970]: I1128 13:23:28.596990 4970 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Nov 28 13:23:28 crc kubenswrapper[4970]: I1128 13:23:28.706971 4970 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Nov 28 13:23:28 crc kubenswrapper[4970]: I1128 13:23:28.810796 4970 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Nov 28 13:23:28 crc kubenswrapper[4970]: I1128 13:23:28.943362 4970 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Nov 28 13:23:28 crc kubenswrapper[4970]: I1128 13:23:28.949308 4970 reflector.go:368] Caches populated for 
*v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Nov 28 13:23:29 crc kubenswrapper[4970]: I1128 13:23:29.061853 4970 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Nov 28 13:23:29 crc kubenswrapper[4970]: I1128 13:23:29.143276 4970 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Nov 28 13:23:29 crc kubenswrapper[4970]: I1128 13:23:29.325587 4970 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Nov 28 13:23:29 crc kubenswrapper[4970]: I1128 13:23:29.420246 4970 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Nov 28 13:23:29 crc kubenswrapper[4970]: I1128 13:23:29.424452 4970 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Nov 28 13:23:29 crc kubenswrapper[4970]: I1128 13:23:29.755195 4970 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Nov 28 13:23:29 crc kubenswrapper[4970]: I1128 13:23:29.775153 4970 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Nov 28 13:23:29 crc kubenswrapper[4970]: I1128 13:23:29.787358 4970 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Nov 28 13:23:29 crc kubenswrapper[4970]: I1128 13:23:29.834504 4970 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Nov 28 13:23:29 crc kubenswrapper[4970]: I1128 13:23:29.868501 4970 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Nov 28 13:23:29 crc kubenswrapper[4970]: I1128 13:23:29.887660 4970 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Nov 28 13:23:29 crc kubenswrapper[4970]: I1128 13:23:29.953599 4970 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Nov 28 13:23:29 crc kubenswrapper[4970]: I1128 13:23:29.962056 4970 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Nov 28 13:23:29 crc kubenswrapper[4970]: I1128 13:23:29.998658 4970 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Nov 28 13:23:30 crc kubenswrapper[4970]: I1128 13:23:30.091513 4970 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Nov 28 13:23:30 crc kubenswrapper[4970]: I1128 13:23:30.227061 4970 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Nov 28 13:23:30 crc kubenswrapper[4970]: I1128 13:23:30.294440 4970 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Nov 28 13:23:30 crc kubenswrapper[4970]: I1128 13:23:30.308916 4970 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Nov 28 13:23:30 crc kubenswrapper[4970]: I1128 
13:23:30.328425 4970 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Nov 28 13:23:30 crc kubenswrapper[4970]: I1128 13:23:30.431523 4970 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Nov 28 13:23:30 crc kubenswrapper[4970]: I1128 13:23:30.462457 4970 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Nov 28 13:23:30 crc kubenswrapper[4970]: I1128 13:23:30.619321 4970 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Nov 28 13:23:30 crc kubenswrapper[4970]: I1128 13:23:30.659751 4970 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Nov 28 13:23:30 crc kubenswrapper[4970]: I1128 13:23:30.707505 4970 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Nov 28 13:23:30 crc kubenswrapper[4970]: I1128 13:23:30.762330 4970 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Nov 28 13:23:30 crc kubenswrapper[4970]: I1128 13:23:30.815709 4970 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Nov 28 13:23:30 crc kubenswrapper[4970]: I1128 13:23:30.823237 4970 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Nov 28 13:23:30 crc kubenswrapper[4970]: I1128 13:23:30.857787 4970 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Nov 28 13:23:31 crc kubenswrapper[4970]: I1128 13:23:31.118325 4970 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Nov 28 13:23:31 crc kubenswrapper[4970]: I1128 13:23:31.142834 4970 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Nov 28 13:23:31 crc kubenswrapper[4970]: I1128 13:23:31.275014 4970 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Nov 28 13:23:31 crc kubenswrapper[4970]: I1128 13:23:31.308083 4970 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Nov 28 13:23:31 crc kubenswrapper[4970]: I1128 13:23:31.342694 4970 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Nov 28 13:23:31 crc kubenswrapper[4970]: I1128 13:23:31.351482 4970 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Nov 28 13:23:31 crc kubenswrapper[4970]: I1128 13:23:31.397728 4970 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Nov 28 13:23:31 crc kubenswrapper[4970]: I1128 13:23:31.435432 4970 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Nov 28 13:23:31 crc kubenswrapper[4970]: I1128 13:23:31.486742 4970 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-machine-api"/"kube-root-ca.crt" Nov 28 13:23:31 crc kubenswrapper[4970]: I1128 13:23:31.611743 4970 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Nov 28 13:23:31 crc kubenswrapper[4970]: I1128 13:23:31.662520 4970 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Nov 28 13:23:31 crc kubenswrapper[4970]: I1128 13:23:31.676501 4970 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Nov 28 13:23:31 crc kubenswrapper[4970]: I1128 13:23:31.800570 4970 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Nov 28 13:23:31 crc kubenswrapper[4970]: I1128 13:23:31.803873 4970 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-wwr74" podStartSLOduration=44.898402097 podStartE2EDuration="2m53.803857009s" podCreationTimestamp="2025-11-28 13:20:38 +0000 UTC" firstStartedPulling="2025-11-28 13:20:40.193886193 +0000 UTC m=+51.046767993" lastFinishedPulling="2025-11-28 13:22:49.099341115 +0000 UTC m=+179.952222905" observedRunningTime="2025-11-28 13:22:49.707256365 +0000 UTC m=+180.560138195" watchObservedRunningTime="2025-11-28 13:23:31.803857009 +0000 UTC m=+222.656738809" Nov 28 13:23:31 crc kubenswrapper[4970]: I1128 13:23:31.805581 4970 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Nov 28 13:23:31 crc kubenswrapper[4970]: I1128 13:23:31.805624 4970 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Nov 28 13:23:31 crc kubenswrapper[4970]: I1128 13:23:31.810596 4970 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 28 13:23:31 crc kubenswrapper[4970]: I1128 13:23:31.829521 4970 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=17.829489195 podStartE2EDuration="17.829489195s" podCreationTimestamp="2025-11-28 13:23:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-28 13:23:31.82410954 +0000 UTC m=+222.676991390" watchObservedRunningTime="2025-11-28 13:23:31.829489195 +0000 UTC m=+222.682371085" Nov 28 13:23:31 crc kubenswrapper[4970]: I1128 13:23:31.848679 4970 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Nov 28 13:23:31 crc kubenswrapper[4970]: I1128 13:23:31.857236 4970 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Nov 28 13:23:31 crc kubenswrapper[4970]: I1128 13:23:31.881348 4970 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Nov 28 13:23:31 crc kubenswrapper[4970]: I1128 13:23:31.990634 4970 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Nov 28 13:23:32 crc kubenswrapper[4970]: I1128 13:23:32.048119 4970 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Nov 28 13:23:32 crc kubenswrapper[4970]: I1128 13:23:32.062882 4970 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Nov 28 13:23:32 crc kubenswrapper[4970]: I1128 13:23:32.182654 4970 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Nov 28 13:23:32 crc kubenswrapper[4970]: I1128 13:23:32.250541 4970 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Nov 28 13:23:32 crc kubenswrapper[4970]: I1128 13:23:32.299495 4970 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Nov 28 13:23:32 crc kubenswrapper[4970]: I1128 13:23:32.329614 4970 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Nov 28 13:23:32 crc kubenswrapper[4970]: I1128 13:23:32.337063 4970 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Nov 28 13:23:32 crc kubenswrapper[4970]: I1128 13:23:32.351617 4970 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Nov 28 13:23:32 crc kubenswrapper[4970]: I1128 13:23:32.577826 4970 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Nov 28 13:23:32 crc kubenswrapper[4970]: I1128 13:23:32.583705 4970 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Nov 28 13:23:32 crc kubenswrapper[4970]: I1128 13:23:32.685468 4970 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Nov 28 13:23:32 crc kubenswrapper[4970]: I1128 13:23:32.718398 4970 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Nov 28 13:23:32 crc kubenswrapper[4970]: I1128 13:23:32.793595 4970 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Nov 28 13:23:32 crc kubenswrapper[4970]: I1128 13:23:32.829312 4970 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Nov 28 13:23:32 crc kubenswrapper[4970]: I1128 13:23:32.876778 4970 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Nov 28 13:23:32 crc kubenswrapper[4970]: I1128 13:23:32.913009 4970 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Nov 28 13:23:32 crc kubenswrapper[4970]: I1128 13:23:32.957659 4970 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Nov 28 13:23:33 crc kubenswrapper[4970]: I1128 13:23:33.078877 4970 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Nov 28 13:23:33 crc kubenswrapper[4970]: I1128 13:23:33.152478 4970 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Nov 28 13:23:33 crc kubenswrapper[4970]: I1128 13:23:33.239164 4970 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Nov 28 13:23:33 crc kubenswrapper[4970]: I1128 13:23:33.294921 4970 reflector.go:368] Caches 
populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Nov 28 13:23:33 crc kubenswrapper[4970]: I1128 13:23:33.351777 4970 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Nov 28 13:23:33 crc kubenswrapper[4970]: I1128 13:23:33.457822 4970 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Nov 28 13:23:33 crc kubenswrapper[4970]: I1128 13:23:33.607646 4970 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Nov 28 13:23:33 crc kubenswrapper[4970]: I1128 13:23:33.798355 4970 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Nov 28 13:23:33 crc kubenswrapper[4970]: I1128 13:23:33.855691 4970 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Nov 28 13:23:33 crc kubenswrapper[4970]: I1128 13:23:33.924738 4970 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Nov 28 13:23:34 crc kubenswrapper[4970]: I1128 13:23:34.097608 4970 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Nov 28 13:23:34 crc kubenswrapper[4970]: I1128 13:23:34.098904 4970 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Nov 28 13:23:34 crc kubenswrapper[4970]: I1128 13:23:34.105687 4970 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Nov 28 13:23:34 crc kubenswrapper[4970]: I1128 13:23:34.105874 4970 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Nov 28 13:23:34 crc kubenswrapper[4970]: I1128 13:23:34.121793 4970 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Nov 28 13:23:34 crc kubenswrapper[4970]: I1128 13:23:34.141125 4970 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Nov 28 13:23:34 crc kubenswrapper[4970]: I1128 13:23:34.170795 4970 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Nov 28 13:23:34 crc kubenswrapper[4970]: I1128 13:23:34.277462 4970 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Nov 28 13:23:34 crc kubenswrapper[4970]: I1128 13:23:34.379570 4970 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Nov 28 13:23:34 crc kubenswrapper[4970]: I1128 13:23:34.438874 4970 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Nov 28 13:23:34 crc kubenswrapper[4970]: I1128 13:23:34.495020 4970 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Nov 28 13:23:34 crc kubenswrapper[4970]: I1128 13:23:34.538976 4970 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Nov 28 13:23:34 crc kubenswrapper[4970]: I1128 13:23:34.544673 4970 reflector.go:368] Caches populated for 
*v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Nov 28 13:23:34 crc kubenswrapper[4970]: I1128 13:23:34.653552 4970 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Nov 28 13:23:34 crc kubenswrapper[4970]: I1128 13:23:34.728911 4970 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Nov 28 13:23:34 crc kubenswrapper[4970]: I1128 13:23:34.754958 4970 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Nov 28 13:23:34 crc kubenswrapper[4970]: I1128 13:23:34.778787 4970 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Nov 28 13:23:34 crc kubenswrapper[4970]: I1128 13:23:34.798566 4970 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Nov 28 13:23:34 crc kubenswrapper[4970]: I1128 13:23:34.871683 4970 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Nov 28 13:23:34 crc kubenswrapper[4970]: I1128 13:23:34.916062 4970 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Nov 28 13:23:34 crc kubenswrapper[4970]: I1128 13:23:34.950464 4970 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Nov 28 13:23:34 crc kubenswrapper[4970]: I1128 13:23:34.973671 4970 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Nov 28 13:23:34 crc kubenswrapper[4970]: I1128 13:23:34.975615 4970 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Nov 28 13:23:34 crc kubenswrapper[4970]: I1128 13:23:34.980389 4970 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Nov 28 13:23:34 crc kubenswrapper[4970]: I1128 13:23:34.981888 4970 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Nov 28 13:23:34 crc kubenswrapper[4970]: I1128 13:23:34.984547 4970 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Nov 28 13:23:35 crc kubenswrapper[4970]: I1128 13:23:35.041418 4970 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Nov 28 13:23:35 crc kubenswrapper[4970]: I1128 13:23:35.271304 4970 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Nov 28 13:23:35 crc kubenswrapper[4970]: I1128 13:23:35.349397 4970 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Nov 28 13:23:35 crc kubenswrapper[4970]: I1128 13:23:35.372353 4970 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Nov 28 13:23:35 crc kubenswrapper[4970]: I1128 13:23:35.393737 4970 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Nov 28 13:23:35 crc 
kubenswrapper[4970]: I1128 13:23:35.422761 4970 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Nov 28 13:23:35 crc kubenswrapper[4970]: I1128 13:23:35.428352 4970 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Nov 28 13:23:35 crc kubenswrapper[4970]: I1128 13:23:35.540372 4970 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Nov 28 13:23:35 crc kubenswrapper[4970]: I1128 13:23:35.544314 4970 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Nov 28 13:23:35 crc kubenswrapper[4970]: I1128 13:23:35.672208 4970 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Nov 28 13:23:35 crc kubenswrapper[4970]: I1128 13:23:35.690318 4970 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Nov 28 13:23:35 crc kubenswrapper[4970]: I1128 13:23:35.729596 4970 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Nov 28 13:23:35 crc kubenswrapper[4970]: I1128 13:23:35.745954 4970 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Nov 28 13:23:35 crc kubenswrapper[4970]: I1128 13:23:35.759057 4970 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Nov 28 13:23:35 crc kubenswrapper[4970]: I1128 13:23:35.900510 4970 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Nov 28 13:23:36 crc kubenswrapper[4970]: I1128 13:23:36.005797 4970 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Nov 28 13:23:36 crc kubenswrapper[4970]: I1128 13:23:36.144350 4970 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Nov 28 13:23:36 crc kubenswrapper[4970]: I1128 13:23:36.185517 4970 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Nov 28 13:23:36 crc kubenswrapper[4970]: I1128 13:23:36.214592 4970 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Nov 28 13:23:36 crc kubenswrapper[4970]: I1128 13:23:36.235862 4970 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Nov 28 13:23:36 crc kubenswrapper[4970]: I1128 13:23:36.366039 4970 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Nov 28 13:23:36 crc kubenswrapper[4970]: I1128 13:23:36.386544 4970 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Nov 28 13:23:36 crc kubenswrapper[4970]: I1128 13:23:36.397511 4970 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Nov 28 13:23:36 crc kubenswrapper[4970]: I1128 13:23:36.409861 4970 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Nov 28 13:23:36 crc kubenswrapper[4970]: I1128 13:23:36.550269 4970 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Nov 28 13:23:36 crc kubenswrapper[4970]: I1128 13:23:36.554051 4970 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Nov 28 13:23:36 crc kubenswrapper[4970]: I1128 13:23:36.559380 4970 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Nov 28 13:23:36 crc kubenswrapper[4970]: I1128 13:23:36.881907 4970 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Nov 28 13:23:36 crc kubenswrapper[4970]: I1128 13:23:36.916755 4970 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Nov 28 13:23:37 crc kubenswrapper[4970]: I1128 13:23:37.036111 4970 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Nov 28 13:23:37 crc kubenswrapper[4970]: I1128 13:23:37.093132 4970 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Nov 28 13:23:37 crc kubenswrapper[4970]: I1128 13:23:37.186037 4970 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Nov 28 13:23:37 crc kubenswrapper[4970]: I1128 13:23:37.226897 4970 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Nov 28 13:23:37 crc kubenswrapper[4970]: I1128 13:23:37.261065 4970 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Nov 28 13:23:37 crc kubenswrapper[4970]: I1128 13:23:37.397711 4970 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Nov 28 13:23:37 crc kubenswrapper[4970]: I1128 13:23:37.398053 4970 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" containerID="cri-o://3bde70e430f719ea4d499e311810983b13de2afe93179bd480075516e18a7c96" gracePeriod=5 Nov 28 13:23:37 crc kubenswrapper[4970]: I1128 13:23:37.414214 4970 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Nov 28 13:23:37 crc kubenswrapper[4970]: I1128 13:23:37.478040 4970 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Nov 28 13:23:37 crc kubenswrapper[4970]: I1128 13:23:37.624290 4970 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Nov 28 13:23:37 crc kubenswrapper[4970]: I1128 13:23:37.812556 4970 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Nov 28 13:23:37 crc kubenswrapper[4970]: I1128 13:23:37.853664 4970 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Nov 28 13:23:37 crc 
kubenswrapper[4970]: I1128 13:23:37.884237 4970 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Nov 28 13:23:37 crc kubenswrapper[4970]: I1128 13:23:37.919436 4970 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Nov 28 13:23:37 crc kubenswrapper[4970]: I1128 13:23:37.944850 4970 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Nov 28 13:23:38 crc kubenswrapper[4970]: I1128 13:23:38.089718 4970 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Nov 28 13:23:38 crc kubenswrapper[4970]: I1128 13:23:38.184362 4970 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Nov 28 13:23:38 crc kubenswrapper[4970]: I1128 13:23:38.238425 4970 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Nov 28 13:23:38 crc kubenswrapper[4970]: I1128 13:23:38.313745 4970 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Nov 28 13:23:38 crc kubenswrapper[4970]: I1128 13:23:38.340113 4970 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Nov 28 13:23:38 crc kubenswrapper[4970]: I1128 13:23:38.661868 4970 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Nov 28 13:23:38 crc kubenswrapper[4970]: I1128 13:23:38.773012 4970 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Nov 28 13:23:38 crc kubenswrapper[4970]: I1128 13:23:38.814378 4970 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Nov 28 13:23:38 crc kubenswrapper[4970]: I1128 13:23:38.864377 4970 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Nov 28 13:23:39 crc kubenswrapper[4970]: I1128 13:23:39.007951 4970 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Nov 28 13:23:39 crc kubenswrapper[4970]: I1128 13:23:39.082191 4970 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Nov 28 13:23:39 crc kubenswrapper[4970]: I1128 13:23:39.146450 4970 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Nov 28 13:23:39 crc kubenswrapper[4970]: I1128 13:23:39.469305 4970 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Nov 28 13:23:39 crc kubenswrapper[4970]: I1128 13:23:39.570922 4970 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Nov 28 13:23:39 crc kubenswrapper[4970]: I1128 13:23:39.581848 4970 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Nov 28 13:23:39 crc kubenswrapper[4970]: I1128 13:23:39.583922 4970 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Nov 28 13:23:39 crc kubenswrapper[4970]: 
I1128 13:23:39.614025 4970 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Nov 28 13:23:39 crc kubenswrapper[4970]: I1128 13:23:39.899287 4970 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Nov 28 13:23:40 crc kubenswrapper[4970]: I1128 13:23:40.032848 4970 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Nov 28 13:23:40 crc kubenswrapper[4970]: I1128 13:23:40.135685 4970 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Nov 28 13:23:40 crc kubenswrapper[4970]: I1128 13:23:40.329402 4970 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Nov 28 13:23:40 crc kubenswrapper[4970]: I1128 13:23:40.935033 4970 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Nov 28 13:23:42 crc kubenswrapper[4970]: I1128 13:23:42.969570 4970 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Nov 28 13:23:42 crc kubenswrapper[4970]: I1128 13:23:42.969955 4970 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Nov 28 13:23:43 crc kubenswrapper[4970]: I1128 13:23:43.081626 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Nov 28 13:23:43 crc kubenswrapper[4970]: I1128 13:23:43.081727 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Nov 28 13:23:43 crc kubenswrapper[4970]: I1128 13:23:43.081802 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock" (OuterVolumeSpecName: "var-lock") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 28 13:23:43 crc kubenswrapper[4970]: I1128 13:23:43.081885 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Nov 28 13:23:43 crc kubenswrapper[4970]: I1128 13:23:43.081897 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log" (OuterVolumeSpecName: "var-log") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-log". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 28 13:23:43 crc kubenswrapper[4970]: I1128 13:23:43.081932 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Nov 28 13:23:43 crc kubenswrapper[4970]: I1128 13:23:43.081985 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Nov 28 13:23:43 crc kubenswrapper[4970]: I1128 13:23:43.082085 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 28 13:23:43 crc kubenswrapper[4970]: I1128 13:23:43.082166 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests" (OuterVolumeSpecName: "manifests") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "manifests". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 28 13:23:43 crc kubenswrapper[4970]: I1128 13:23:43.082560 4970 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") on node \"crc\" DevicePath \"\"" Nov 28 13:23:43 crc kubenswrapper[4970]: I1128 13:23:43.082584 4970 reconciler_common.go:293] "Volume detached for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") on node \"crc\" DevicePath \"\"" Nov 28 13:23:43 crc kubenswrapper[4970]: I1128 13:23:43.082602 4970 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") on node \"crc\" DevicePath \"\"" Nov 28 13:23:43 crc kubenswrapper[4970]: I1128 13:23:43.082624 4970 reconciler_common.go:293] "Volume detached for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") on node \"crc\" DevicePath \"\"" Nov 28 13:23:43 crc kubenswrapper[4970]: I1128 13:23:43.096406 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir" (OuterVolumeSpecName: "pod-resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "pod-resource-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 28 13:23:43 crc kubenswrapper[4970]: I1128 13:23:43.103032 4970 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Nov 28 13:23:43 crc kubenswrapper[4970]: I1128 13:23:43.103138 4970 generic.go:334] "Generic (PLEG): container finished" podID="f85e55b1a89d02b0cb034b1ea31ed45a" containerID="3bde70e430f719ea4d499e311810983b13de2afe93179bd480075516e18a7c96" exitCode=137 Nov 28 13:23:43 crc kubenswrapper[4970]: I1128 13:23:43.103291 4970 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Nov 28 13:23:43 crc kubenswrapper[4970]: I1128 13:23:43.103299 4970 scope.go:117] "RemoveContainer" containerID="3bde70e430f719ea4d499e311810983b13de2afe93179bd480075516e18a7c96" Nov 28 13:23:43 crc kubenswrapper[4970]: I1128 13:23:43.151654 4970 scope.go:117] "RemoveContainer" containerID="3bde70e430f719ea4d499e311810983b13de2afe93179bd480075516e18a7c96" Nov 28 13:23:43 crc kubenswrapper[4970]: E1128 13:23:43.155119 4970 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3bde70e430f719ea4d499e311810983b13de2afe93179bd480075516e18a7c96\": container with ID starting with 3bde70e430f719ea4d499e311810983b13de2afe93179bd480075516e18a7c96 not found: ID does not exist" containerID="3bde70e430f719ea4d499e311810983b13de2afe93179bd480075516e18a7c96" Nov 28 13:23:43 crc kubenswrapper[4970]: I1128 13:23:43.155164 4970 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3bde70e430f719ea4d499e311810983b13de2afe93179bd480075516e18a7c96"} err="failed to get container status \"3bde70e430f719ea4d499e311810983b13de2afe93179bd480075516e18a7c96\": rpc error: code = NotFound desc = could not find container \"3bde70e430f719ea4d499e311810983b13de2afe93179bd480075516e18a7c96\": container with ID starting with 3bde70e430f719ea4d499e311810983b13de2afe93179bd480075516e18a7c96 not found: ID does not exist" Nov 28 13:23:43 crc kubenswrapper[4970]: I1128 13:23:43.183960 4970 reconciler_common.go:293] "Volume detached for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") on node \"crc\" DevicePath \"\"" Nov 28 13:23:43 crc kubenswrapper[4970]: I1128 13:23:43.393304 4970 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" path="/var/lib/kubelet/pods/f85e55b1a89d02b0cb034b1ea31ed45a/volumes" Nov 28 13:23:50 crc kubenswrapper[4970]: I1128 13:23:50.194378 4970 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Nov 28 13:23:50 crc kubenswrapper[4970]: I1128 13:23:50.501257 4970 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Nov 28 13:23:53 crc kubenswrapper[4970]: I1128 13:23:53.565075 4970 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Nov 28 13:23:54 crc kubenswrapper[4970]: I1128 13:23:54.489330 4970 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Nov 28 13:23:55 crc kubenswrapper[4970]: I1128 13:23:55.046389 4970 reflector.go:368] Caches populated for 
*v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Nov 28 13:23:55 crc kubenswrapper[4970]: I1128 13:23:55.094282 4970 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Nov 28 13:23:55 crc kubenswrapper[4970]: I1128 13:23:55.736640 4970 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Nov 28 13:23:55 crc kubenswrapper[4970]: I1128 13:23:55.990167 4970 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Nov 28 13:23:56 crc kubenswrapper[4970]: I1128 13:23:56.168530 4970 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Nov 28 13:23:57 crc kubenswrapper[4970]: I1128 13:23:57.107790 4970 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Nov 28 13:23:57 crc kubenswrapper[4970]: I1128 13:23:57.888960 4970 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Nov 28 13:23:58 crc kubenswrapper[4970]: I1128 13:23:58.600862 4970 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Nov 28 13:23:58 crc kubenswrapper[4970]: I1128 13:23:58.856188 4970 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Nov 28 13:23:58 crc kubenswrapper[4970]: I1128 13:23:58.959989 4970 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Nov 28 13:23:58 crc kubenswrapper[4970]: I1128 13:23:58.977059 4970 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Nov 28 13:23:59 crc kubenswrapper[4970]: I1128 13:23:59.082287 4970 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Nov 28 13:23:59 crc kubenswrapper[4970]: I1128 13:23:59.818579 4970 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Nov 28 13:23:59 crc kubenswrapper[4970]: I1128 13:23:59.864419 4970 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Nov 28 13:24:00 crc kubenswrapper[4970]: I1128 13:24:00.260061 4970 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Nov 28 13:24:00 crc kubenswrapper[4970]: I1128 13:24:00.991261 4970 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Nov 28 13:24:01 crc kubenswrapper[4970]: I1128 13:24:01.261366 4970 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Nov 28 13:24:01 crc kubenswrapper[4970]: I1128 13:24:01.397844 4970 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Nov 28 13:24:01 crc kubenswrapper[4970]: I1128 13:24:01.677528 4970 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Nov 28 13:24:02 crc kubenswrapper[4970]: I1128 13:24:02.343004 4970 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Nov 28 13:24:02 crc kubenswrapper[4970]: I1128 13:24:02.771364 4970 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Nov 28 13:24:03 crc kubenswrapper[4970]: I1128 13:24:03.430145 4970 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Nov 28 13:24:04 crc kubenswrapper[4970]: I1128 13:24:04.873081 4970 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Nov 28 13:24:05 crc kubenswrapper[4970]: I1128 13:24:05.524964 4970 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Nov 28 13:24:06 crc kubenswrapper[4970]: I1128 13:24:06.156598 4970 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Nov 28 13:24:06 crc kubenswrapper[4970]: I1128 13:24:06.554087 4970 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Nov 28 13:24:06 crc kubenswrapper[4970]: I1128 13:24:06.576395 4970 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Nov 28 13:24:07 crc kubenswrapper[4970]: I1128 13:24:07.254326 4970 generic.go:334] "Generic (PLEG): container finished" podID="514859c2-bd3c-4ccb-90b0-61180a1bc297" containerID="8f71c320f5c6711aaf58cf57698a53bb88acc22f7294690f3fd6abe7955cdb28" exitCode=0 Nov 28 13:24:07 crc kubenswrapper[4970]: I1128 13:24:07.254467 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-nccsb" event={"ID":"514859c2-bd3c-4ccb-90b0-61180a1bc297","Type":"ContainerDied","Data":"8f71c320f5c6711aaf58cf57698a53bb88acc22f7294690f3fd6abe7955cdb28"} Nov 28 13:24:07 crc kubenswrapper[4970]: I1128 13:24:07.255485 4970 scope.go:117] "RemoveContainer" containerID="8f71c320f5c6711aaf58cf57698a53bb88acc22f7294690f3fd6abe7955cdb28" Nov 28 13:24:07 crc kubenswrapper[4970]: I1128 13:24:07.710477 4970 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Nov 28 13:24:08 crc kubenswrapper[4970]: I1128 13:24:08.004733 4970 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Nov 28 13:24:08 crc kubenswrapper[4970]: I1128 13:24:08.266311 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-nccsb" event={"ID":"514859c2-bd3c-4ccb-90b0-61180a1bc297","Type":"ContainerStarted","Data":"69ef22519acaf3b357a25c50f3ae3e80401c3a4f71a795b0a33d9cdc4be82cbf"} Nov 28 13:24:08 crc kubenswrapper[4970]: I1128 13:24:08.268201 4970 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-nccsb" Nov 28 13:24:08 crc kubenswrapper[4970]: I1128 13:24:08.272937 4970 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-nccsb" Nov 28 13:24:08 crc kubenswrapper[4970]: I1128 13:24:08.801821 4970 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Nov 28 13:24:09 crc kubenswrapper[4970]: I1128 13:24:09.716148 4970 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" 
Nov 28 13:24:09 crc kubenswrapper[4970]: I1128 13:24:09.761353 4970 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Nov 28 13:24:09 crc kubenswrapper[4970]: I1128 13:24:09.806580 4970 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Nov 28 13:24:09 crc kubenswrapper[4970]: I1128 13:24:09.870268 4970 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Nov 28 13:24:10 crc kubenswrapper[4970]: I1128 13:24:10.920093 4970 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Nov 28 13:24:11 crc kubenswrapper[4970]: I1128 13:24:11.081237 4970 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Nov 28 13:24:11 crc kubenswrapper[4970]: I1128 13:24:11.270292 4970 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Nov 28 13:24:11 crc kubenswrapper[4970]: I1128 13:24:11.424040 4970 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Nov 28 13:24:12 crc kubenswrapper[4970]: I1128 13:24:12.565602 4970 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Nov 28 13:24:12 crc kubenswrapper[4970]: I1128 13:24:12.747609 4970 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Nov 28 13:24:14 crc kubenswrapper[4970]: I1128 13:24:14.253672 4970 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Nov 28 13:24:14 crc kubenswrapper[4970]: I1128 13:24:14.583022 4970 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Nov 28 13:24:16 crc kubenswrapper[4970]: I1128 13:24:16.472487 4970 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Nov 28 13:24:19 crc kubenswrapper[4970]: I1128 13:24:19.919158 4970 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Nov 28 13:24:20 crc kubenswrapper[4970]: I1128 13:24:20.716537 4970 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Nov 28 13:24:23 crc kubenswrapper[4970]: I1128 13:24:23.503822 4970 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Nov 28 13:24:36 crc kubenswrapper[4970]: I1128 13:24:36.751644 4970 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-cbvqk"] Nov 28 13:24:36 crc kubenswrapper[4970]: I1128 13:24:36.752532 4970 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-879f6c89f-cbvqk" podUID="e80ce492-28d4-40cf-8a55-5a4f456e8255" containerName="controller-manager" containerID="cri-o://e676930d85edea2db1d850cb45eef6714b770e9ed8cf3d755518912ff0524243" gracePeriod=30 Nov 28 13:24:36 crc kubenswrapper[4970]: I1128 13:24:36.852862 4970 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-znlgk"] Nov 28 
13:24:36 crc kubenswrapper[4970]: I1128 13:24:36.853106 4970 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-znlgk" podUID="900e9596-8294-4c4d-857a-1b2bf9adaca7" containerName="route-controller-manager" containerID="cri-o://f6b5c781ecb8244e4ea38fb275437d136647bdc806ff3b2ce2d00ad80aadf6d8" gracePeriod=30 Nov 28 13:24:37 crc kubenswrapper[4970]: I1128 13:24:37.177946 4970 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-cbvqk" Nov 28 13:24:37 crc kubenswrapper[4970]: I1128 13:24:37.189453 4970 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-znlgk" Nov 28 13:24:37 crc kubenswrapper[4970]: I1128 13:24:37.301617 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e80ce492-28d4-40cf-8a55-5a4f456e8255-client-ca\") pod \"e80ce492-28d4-40cf-8a55-5a4f456e8255\" (UID: \"e80ce492-28d4-40cf-8a55-5a4f456e8255\") " Nov 28 13:24:37 crc kubenswrapper[4970]: I1128 13:24:37.301673 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e80ce492-28d4-40cf-8a55-5a4f456e8255-config\") pod \"e80ce492-28d4-40cf-8a55-5a4f456e8255\" (UID: \"e80ce492-28d4-40cf-8a55-5a4f456e8255\") " Nov 28 13:24:37 crc kubenswrapper[4970]: I1128 13:24:37.301701 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e80ce492-28d4-40cf-8a55-5a4f456e8255-serving-cert\") pod \"e80ce492-28d4-40cf-8a55-5a4f456e8255\" (UID: \"e80ce492-28d4-40cf-8a55-5a4f456e8255\") " Nov 28 13:24:37 crc kubenswrapper[4970]: I1128 13:24:37.301727 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wpv9z\" (UniqueName: \"kubernetes.io/projected/900e9596-8294-4c4d-857a-1b2bf9adaca7-kube-api-access-wpv9z\") pod \"900e9596-8294-4c4d-857a-1b2bf9adaca7\" (UID: \"900e9596-8294-4c4d-857a-1b2bf9adaca7\") " Nov 28 13:24:37 crc kubenswrapper[4970]: I1128 13:24:37.301780 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e80ce492-28d4-40cf-8a55-5a4f456e8255-proxy-ca-bundles\") pod \"e80ce492-28d4-40cf-8a55-5a4f456e8255\" (UID: \"e80ce492-28d4-40cf-8a55-5a4f456e8255\") " Nov 28 13:24:37 crc kubenswrapper[4970]: I1128 13:24:37.301799 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/900e9596-8294-4c4d-857a-1b2bf9adaca7-config\") pod \"900e9596-8294-4c4d-857a-1b2bf9adaca7\" (UID: \"900e9596-8294-4c4d-857a-1b2bf9adaca7\") " Nov 28 13:24:37 crc kubenswrapper[4970]: I1128 13:24:37.301818 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/900e9596-8294-4c4d-857a-1b2bf9adaca7-client-ca\") pod \"900e9596-8294-4c4d-857a-1b2bf9adaca7\" (UID: \"900e9596-8294-4c4d-857a-1b2bf9adaca7\") " Nov 28 13:24:37 crc kubenswrapper[4970]: I1128 13:24:37.301833 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-45pqp\" (UniqueName: 
\"kubernetes.io/projected/e80ce492-28d4-40cf-8a55-5a4f456e8255-kube-api-access-45pqp\") pod \"e80ce492-28d4-40cf-8a55-5a4f456e8255\" (UID: \"e80ce492-28d4-40cf-8a55-5a4f456e8255\") " Nov 28 13:24:37 crc kubenswrapper[4970]: I1128 13:24:37.301846 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/900e9596-8294-4c4d-857a-1b2bf9adaca7-serving-cert\") pod \"900e9596-8294-4c4d-857a-1b2bf9adaca7\" (UID: \"900e9596-8294-4c4d-857a-1b2bf9adaca7\") " Nov 28 13:24:37 crc kubenswrapper[4970]: I1128 13:24:37.302494 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e80ce492-28d4-40cf-8a55-5a4f456e8255-config" (OuterVolumeSpecName: "config") pod "e80ce492-28d4-40cf-8a55-5a4f456e8255" (UID: "e80ce492-28d4-40cf-8a55-5a4f456e8255"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 28 13:24:37 crc kubenswrapper[4970]: I1128 13:24:37.302517 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e80ce492-28d4-40cf-8a55-5a4f456e8255-client-ca" (OuterVolumeSpecName: "client-ca") pod "e80ce492-28d4-40cf-8a55-5a4f456e8255" (UID: "e80ce492-28d4-40cf-8a55-5a4f456e8255"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 28 13:24:37 crc kubenswrapper[4970]: I1128 13:24:37.302969 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/900e9596-8294-4c4d-857a-1b2bf9adaca7-config" (OuterVolumeSpecName: "config") pod "900e9596-8294-4c4d-857a-1b2bf9adaca7" (UID: "900e9596-8294-4c4d-857a-1b2bf9adaca7"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 28 13:24:37 crc kubenswrapper[4970]: I1128 13:24:37.303495 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e80ce492-28d4-40cf-8a55-5a4f456e8255-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "e80ce492-28d4-40cf-8a55-5a4f456e8255" (UID: "e80ce492-28d4-40cf-8a55-5a4f456e8255"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 28 13:24:37 crc kubenswrapper[4970]: I1128 13:24:37.303717 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/900e9596-8294-4c4d-857a-1b2bf9adaca7-client-ca" (OuterVolumeSpecName: "client-ca") pod "900e9596-8294-4c4d-857a-1b2bf9adaca7" (UID: "900e9596-8294-4c4d-857a-1b2bf9adaca7"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 28 13:24:37 crc kubenswrapper[4970]: I1128 13:24:37.306867 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/900e9596-8294-4c4d-857a-1b2bf9adaca7-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "900e9596-8294-4c4d-857a-1b2bf9adaca7" (UID: "900e9596-8294-4c4d-857a-1b2bf9adaca7"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 28 13:24:37 crc kubenswrapper[4970]: I1128 13:24:37.306980 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/900e9596-8294-4c4d-857a-1b2bf9adaca7-kube-api-access-wpv9z" (OuterVolumeSpecName: "kube-api-access-wpv9z") pod "900e9596-8294-4c4d-857a-1b2bf9adaca7" (UID: "900e9596-8294-4c4d-857a-1b2bf9adaca7"). InnerVolumeSpecName "kube-api-access-wpv9z". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 28 13:24:37 crc kubenswrapper[4970]: I1128 13:24:37.306986 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e80ce492-28d4-40cf-8a55-5a4f456e8255-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e80ce492-28d4-40cf-8a55-5a4f456e8255" (UID: "e80ce492-28d4-40cf-8a55-5a4f456e8255"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 28 13:24:37 crc kubenswrapper[4970]: I1128 13:24:37.307070 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e80ce492-28d4-40cf-8a55-5a4f456e8255-kube-api-access-45pqp" (OuterVolumeSpecName: "kube-api-access-45pqp") pod "e80ce492-28d4-40cf-8a55-5a4f456e8255" (UID: "e80ce492-28d4-40cf-8a55-5a4f456e8255"). InnerVolumeSpecName "kube-api-access-45pqp". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 28 13:24:37 crc kubenswrapper[4970]: I1128 13:24:37.402808 4970 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e80ce492-28d4-40cf-8a55-5a4f456e8255-config\") on node \"crc\" DevicePath \"\"" Nov 28 13:24:37 crc kubenswrapper[4970]: I1128 13:24:37.402845 4970 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e80ce492-28d4-40cf-8a55-5a4f456e8255-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 28 13:24:37 crc kubenswrapper[4970]: I1128 13:24:37.402859 4970 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wpv9z\" (UniqueName: \"kubernetes.io/projected/900e9596-8294-4c4d-857a-1b2bf9adaca7-kube-api-access-wpv9z\") on node \"crc\" DevicePath \"\"" Nov 28 13:24:37 crc kubenswrapper[4970]: I1128 13:24:37.402872 4970 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e80ce492-28d4-40cf-8a55-5a4f456e8255-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Nov 28 13:24:37 crc kubenswrapper[4970]: I1128 13:24:37.402888 4970 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/900e9596-8294-4c4d-857a-1b2bf9adaca7-config\") on node \"crc\" DevicePath \"\"" Nov 28 13:24:37 crc kubenswrapper[4970]: I1128 13:24:37.402924 4970 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/900e9596-8294-4c4d-857a-1b2bf9adaca7-client-ca\") on node \"crc\" DevicePath \"\"" Nov 28 13:24:37 crc kubenswrapper[4970]: I1128 13:24:37.402942 4970 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-45pqp\" (UniqueName: \"kubernetes.io/projected/e80ce492-28d4-40cf-8a55-5a4f456e8255-kube-api-access-45pqp\") on node \"crc\" DevicePath \"\"" Nov 28 13:24:37 crc kubenswrapper[4970]: I1128 13:24:37.402957 4970 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/900e9596-8294-4c4d-857a-1b2bf9adaca7-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 28 13:24:37 crc kubenswrapper[4970]: I1128 13:24:37.402976 4970 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e80ce492-28d4-40cf-8a55-5a4f456e8255-client-ca\") on node \"crc\" DevicePath \"\"" Nov 28 13:24:37 crc kubenswrapper[4970]: I1128 13:24:37.451846 4970 generic.go:334] "Generic (PLEG): container finished" podID="e80ce492-28d4-40cf-8a55-5a4f456e8255" 
containerID="e676930d85edea2db1d850cb45eef6714b770e9ed8cf3d755518912ff0524243" exitCode=0 Nov 28 13:24:37 crc kubenswrapper[4970]: I1128 13:24:37.451919 4970 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-cbvqk" Nov 28 13:24:37 crc kubenswrapper[4970]: I1128 13:24:37.451947 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-cbvqk" event={"ID":"e80ce492-28d4-40cf-8a55-5a4f456e8255","Type":"ContainerDied","Data":"e676930d85edea2db1d850cb45eef6714b770e9ed8cf3d755518912ff0524243"} Nov 28 13:24:37 crc kubenswrapper[4970]: I1128 13:24:37.451985 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-cbvqk" event={"ID":"e80ce492-28d4-40cf-8a55-5a4f456e8255","Type":"ContainerDied","Data":"9b529263a5dbbe3335d70b89eaa6aa73285801afa02ebf6f982dadc80b9b751a"} Nov 28 13:24:37 crc kubenswrapper[4970]: I1128 13:24:37.452012 4970 scope.go:117] "RemoveContainer" containerID="e676930d85edea2db1d850cb45eef6714b770e9ed8cf3d755518912ff0524243" Nov 28 13:24:37 crc kubenswrapper[4970]: I1128 13:24:37.458609 4970 generic.go:334] "Generic (PLEG): container finished" podID="900e9596-8294-4c4d-857a-1b2bf9adaca7" containerID="f6b5c781ecb8244e4ea38fb275437d136647bdc806ff3b2ce2d00ad80aadf6d8" exitCode=0 Nov 28 13:24:37 crc kubenswrapper[4970]: I1128 13:24:37.458653 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-znlgk" event={"ID":"900e9596-8294-4c4d-857a-1b2bf9adaca7","Type":"ContainerDied","Data":"f6b5c781ecb8244e4ea38fb275437d136647bdc806ff3b2ce2d00ad80aadf6d8"} Nov 28 13:24:37 crc kubenswrapper[4970]: I1128 13:24:37.458721 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-znlgk" event={"ID":"900e9596-8294-4c4d-857a-1b2bf9adaca7","Type":"ContainerDied","Data":"c907cca720cf1bfebb42cb4366e30c6dbc83e45aae70d971f2c27188571238b6"} Nov 28 13:24:37 crc kubenswrapper[4970]: I1128 13:24:37.458747 4970 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-znlgk" Nov 28 13:24:37 crc kubenswrapper[4970]: I1128 13:24:37.472583 4970 scope.go:117] "RemoveContainer" containerID="e676930d85edea2db1d850cb45eef6714b770e9ed8cf3d755518912ff0524243" Nov 28 13:24:37 crc kubenswrapper[4970]: E1128 13:24:37.473344 4970 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e676930d85edea2db1d850cb45eef6714b770e9ed8cf3d755518912ff0524243\": container with ID starting with e676930d85edea2db1d850cb45eef6714b770e9ed8cf3d755518912ff0524243 not found: ID does not exist" containerID="e676930d85edea2db1d850cb45eef6714b770e9ed8cf3d755518912ff0524243" Nov 28 13:24:37 crc kubenswrapper[4970]: I1128 13:24:37.473396 4970 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e676930d85edea2db1d850cb45eef6714b770e9ed8cf3d755518912ff0524243"} err="failed to get container status \"e676930d85edea2db1d850cb45eef6714b770e9ed8cf3d755518912ff0524243\": rpc error: code = NotFound desc = could not find container \"e676930d85edea2db1d850cb45eef6714b770e9ed8cf3d755518912ff0524243\": container with ID starting with e676930d85edea2db1d850cb45eef6714b770e9ed8cf3d755518912ff0524243 not found: ID does not exist" Nov 28 13:24:37 crc kubenswrapper[4970]: I1128 13:24:37.473416 4970 scope.go:117] "RemoveContainer" containerID="f6b5c781ecb8244e4ea38fb275437d136647bdc806ff3b2ce2d00ad80aadf6d8" Nov 28 13:24:37 crc kubenswrapper[4970]: I1128 13:24:37.483791 4970 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-cbvqk"] Nov 28 13:24:37 crc kubenswrapper[4970]: I1128 13:24:37.493164 4970 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-cbvqk"] Nov 28 13:24:37 crc kubenswrapper[4970]: I1128 13:24:37.500866 4970 scope.go:117] "RemoveContainer" containerID="f6b5c781ecb8244e4ea38fb275437d136647bdc806ff3b2ce2d00ad80aadf6d8" Nov 28 13:24:37 crc kubenswrapper[4970]: E1128 13:24:37.501404 4970 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f6b5c781ecb8244e4ea38fb275437d136647bdc806ff3b2ce2d00ad80aadf6d8\": container with ID starting with f6b5c781ecb8244e4ea38fb275437d136647bdc806ff3b2ce2d00ad80aadf6d8 not found: ID does not exist" containerID="f6b5c781ecb8244e4ea38fb275437d136647bdc806ff3b2ce2d00ad80aadf6d8" Nov 28 13:24:37 crc kubenswrapper[4970]: I1128 13:24:37.501471 4970 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f6b5c781ecb8244e4ea38fb275437d136647bdc806ff3b2ce2d00ad80aadf6d8"} err="failed to get container status \"f6b5c781ecb8244e4ea38fb275437d136647bdc806ff3b2ce2d00ad80aadf6d8\": rpc error: code = NotFound desc = could not find container \"f6b5c781ecb8244e4ea38fb275437d136647bdc806ff3b2ce2d00ad80aadf6d8\": container with ID starting with f6b5c781ecb8244e4ea38fb275437d136647bdc806ff3b2ce2d00ad80aadf6d8 not found: ID does not exist" Nov 28 13:24:37 crc kubenswrapper[4970]: I1128 13:24:37.502906 4970 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-znlgk"] Nov 28 13:24:37 crc kubenswrapper[4970]: I1128 13:24:37.508039 4970 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-znlgk"] Nov 28 
13:24:38 crc kubenswrapper[4970]: I1128 13:24:38.378943 4970 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-657fdc8645-zvsmm"] Nov 28 13:24:38 crc kubenswrapper[4970]: E1128 13:24:38.379622 4970 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="900e9596-8294-4c4d-857a-1b2bf9adaca7" containerName="route-controller-manager" Nov 28 13:24:38 crc kubenswrapper[4970]: I1128 13:24:38.379639 4970 state_mem.go:107] "Deleted CPUSet assignment" podUID="900e9596-8294-4c4d-857a-1b2bf9adaca7" containerName="route-controller-manager" Nov 28 13:24:38 crc kubenswrapper[4970]: E1128 13:24:38.379650 4970 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Nov 28 13:24:38 crc kubenswrapper[4970]: I1128 13:24:38.379659 4970 state_mem.go:107] "Deleted CPUSet assignment" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Nov 28 13:24:38 crc kubenswrapper[4970]: E1128 13:24:38.379670 4970 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0a609cfd-28dc-440a-8433-6933565864a7" containerName="installer" Nov 28 13:24:38 crc kubenswrapper[4970]: I1128 13:24:38.379678 4970 state_mem.go:107] "Deleted CPUSet assignment" podUID="0a609cfd-28dc-440a-8433-6933565864a7" containerName="installer" Nov 28 13:24:38 crc kubenswrapper[4970]: E1128 13:24:38.379698 4970 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e80ce492-28d4-40cf-8a55-5a4f456e8255" containerName="controller-manager" Nov 28 13:24:38 crc kubenswrapper[4970]: I1128 13:24:38.379706 4970 state_mem.go:107] "Deleted CPUSet assignment" podUID="e80ce492-28d4-40cf-8a55-5a4f456e8255" containerName="controller-manager" Nov 28 13:24:38 crc kubenswrapper[4970]: I1128 13:24:38.379827 4970 memory_manager.go:354] "RemoveStaleState removing state" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Nov 28 13:24:38 crc kubenswrapper[4970]: I1128 13:24:38.379846 4970 memory_manager.go:354] "RemoveStaleState removing state" podUID="0a609cfd-28dc-440a-8433-6933565864a7" containerName="installer" Nov 28 13:24:38 crc kubenswrapper[4970]: I1128 13:24:38.379858 4970 memory_manager.go:354] "RemoveStaleState removing state" podUID="e80ce492-28d4-40cf-8a55-5a4f456e8255" containerName="controller-manager" Nov 28 13:24:38 crc kubenswrapper[4970]: I1128 13:24:38.379872 4970 memory_manager.go:354] "RemoveStaleState removing state" podUID="900e9596-8294-4c4d-857a-1b2bf9adaca7" containerName="route-controller-manager" Nov 28 13:24:38 crc kubenswrapper[4970]: I1128 13:24:38.380540 4970 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-657fdc8645-zvsmm" Nov 28 13:24:38 crc kubenswrapper[4970]: I1128 13:24:38.385649 4970 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Nov 28 13:24:38 crc kubenswrapper[4970]: I1128 13:24:38.385700 4970 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-574c848897-sqzn9"] Nov 28 13:24:38 crc kubenswrapper[4970]: I1128 13:24:38.385925 4970 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Nov 28 13:24:38 crc kubenswrapper[4970]: I1128 13:24:38.386035 4970 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Nov 28 13:24:38 crc kubenswrapper[4970]: I1128 13:24:38.386146 4970 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Nov 28 13:24:38 crc kubenswrapper[4970]: I1128 13:24:38.386395 4970 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Nov 28 13:24:38 crc kubenswrapper[4970]: I1128 13:24:38.386663 4970 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Nov 28 13:24:38 crc kubenswrapper[4970]: I1128 13:24:38.386898 4970 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-574c848897-sqzn9" Nov 28 13:24:38 crc kubenswrapper[4970]: I1128 13:24:38.390119 4970 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Nov 28 13:24:38 crc kubenswrapper[4970]: I1128 13:24:38.391948 4970 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Nov 28 13:24:38 crc kubenswrapper[4970]: I1128 13:24:38.392736 4970 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Nov 28 13:24:38 crc kubenswrapper[4970]: I1128 13:24:38.394676 4970 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Nov 28 13:24:38 crc kubenswrapper[4970]: I1128 13:24:38.396589 4970 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Nov 28 13:24:38 crc kubenswrapper[4970]: I1128 13:24:38.400304 4970 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Nov 28 13:24:38 crc kubenswrapper[4970]: I1128 13:24:38.403514 4970 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-657fdc8645-zvsmm"] Nov 28 13:24:38 crc kubenswrapper[4970]: I1128 13:24:38.407853 4970 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Nov 28 13:24:38 crc kubenswrapper[4970]: I1128 13:24:38.419107 4970 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-574c848897-sqzn9"] Nov 28 13:24:38 crc kubenswrapper[4970]: I1128 13:24:38.515360 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: 
\"kubernetes.io/configmap/a4c98945-289e-4bad-ae74-d0a4feada930-client-ca\") pod \"controller-manager-574c848897-sqzn9\" (UID: \"a4c98945-289e-4bad-ae74-d0a4feada930\") " pod="openshift-controller-manager/controller-manager-574c848897-sqzn9" Nov 28 13:24:38 crc kubenswrapper[4970]: I1128 13:24:38.515431 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4369b978-7720-433e-ba2a-ef658c76b0b2-serving-cert\") pod \"route-controller-manager-657fdc8645-zvsmm\" (UID: \"4369b978-7720-433e-ba2a-ef658c76b0b2\") " pod="openshift-route-controller-manager/route-controller-manager-657fdc8645-zvsmm" Nov 28 13:24:38 crc kubenswrapper[4970]: I1128 13:24:38.515519 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4369b978-7720-433e-ba2a-ef658c76b0b2-client-ca\") pod \"route-controller-manager-657fdc8645-zvsmm\" (UID: \"4369b978-7720-433e-ba2a-ef658c76b0b2\") " pod="openshift-route-controller-manager/route-controller-manager-657fdc8645-zvsmm" Nov 28 13:24:38 crc kubenswrapper[4970]: I1128 13:24:38.515573 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a4c98945-289e-4bad-ae74-d0a4feada930-serving-cert\") pod \"controller-manager-574c848897-sqzn9\" (UID: \"a4c98945-289e-4bad-ae74-d0a4feada930\") " pod="openshift-controller-manager/controller-manager-574c848897-sqzn9" Nov 28 13:24:38 crc kubenswrapper[4970]: I1128 13:24:38.515607 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4369b978-7720-433e-ba2a-ef658c76b0b2-config\") pod \"route-controller-manager-657fdc8645-zvsmm\" (UID: \"4369b978-7720-433e-ba2a-ef658c76b0b2\") " pod="openshift-route-controller-manager/route-controller-manager-657fdc8645-zvsmm" Nov 28 13:24:38 crc kubenswrapper[4970]: I1128 13:24:38.515688 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6csst\" (UniqueName: \"kubernetes.io/projected/a4c98945-289e-4bad-ae74-d0a4feada930-kube-api-access-6csst\") pod \"controller-manager-574c848897-sqzn9\" (UID: \"a4c98945-289e-4bad-ae74-d0a4feada930\") " pod="openshift-controller-manager/controller-manager-574c848897-sqzn9" Nov 28 13:24:38 crc kubenswrapper[4970]: I1128 13:24:38.515727 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a4c98945-289e-4bad-ae74-d0a4feada930-config\") pod \"controller-manager-574c848897-sqzn9\" (UID: \"a4c98945-289e-4bad-ae74-d0a4feada930\") " pod="openshift-controller-manager/controller-manager-574c848897-sqzn9" Nov 28 13:24:38 crc kubenswrapper[4970]: I1128 13:24:38.515789 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l4gnr\" (UniqueName: \"kubernetes.io/projected/4369b978-7720-433e-ba2a-ef658c76b0b2-kube-api-access-l4gnr\") pod \"route-controller-manager-657fdc8645-zvsmm\" (UID: \"4369b978-7720-433e-ba2a-ef658c76b0b2\") " pod="openshift-route-controller-manager/route-controller-manager-657fdc8645-zvsmm" Nov 28 13:24:38 crc kubenswrapper[4970]: I1128 13:24:38.515834 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/a4c98945-289e-4bad-ae74-d0a4feada930-proxy-ca-bundles\") pod \"controller-manager-574c848897-sqzn9\" (UID: \"a4c98945-289e-4bad-ae74-d0a4feada930\") " pod="openshift-controller-manager/controller-manager-574c848897-sqzn9" Nov 28 13:24:38 crc kubenswrapper[4970]: I1128 13:24:38.616907 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6csst\" (UniqueName: \"kubernetes.io/projected/a4c98945-289e-4bad-ae74-d0a4feada930-kube-api-access-6csst\") pod \"controller-manager-574c848897-sqzn9\" (UID: \"a4c98945-289e-4bad-ae74-d0a4feada930\") " pod="openshift-controller-manager/controller-manager-574c848897-sqzn9" Nov 28 13:24:38 crc kubenswrapper[4970]: I1128 13:24:38.616984 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a4c98945-289e-4bad-ae74-d0a4feada930-config\") pod \"controller-manager-574c848897-sqzn9\" (UID: \"a4c98945-289e-4bad-ae74-d0a4feada930\") " pod="openshift-controller-manager/controller-manager-574c848897-sqzn9" Nov 28 13:24:38 crc kubenswrapper[4970]: I1128 13:24:38.617061 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l4gnr\" (UniqueName: \"kubernetes.io/projected/4369b978-7720-433e-ba2a-ef658c76b0b2-kube-api-access-l4gnr\") pod \"route-controller-manager-657fdc8645-zvsmm\" (UID: \"4369b978-7720-433e-ba2a-ef658c76b0b2\") " pod="openshift-route-controller-manager/route-controller-manager-657fdc8645-zvsmm" Nov 28 13:24:38 crc kubenswrapper[4970]: I1128 13:24:38.617112 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/a4c98945-289e-4bad-ae74-d0a4feada930-proxy-ca-bundles\") pod \"controller-manager-574c848897-sqzn9\" (UID: \"a4c98945-289e-4bad-ae74-d0a4feada930\") " pod="openshift-controller-manager/controller-manager-574c848897-sqzn9" Nov 28 13:24:38 crc kubenswrapper[4970]: I1128 13:24:38.617186 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a4c98945-289e-4bad-ae74-d0a4feada930-client-ca\") pod \"controller-manager-574c848897-sqzn9\" (UID: \"a4c98945-289e-4bad-ae74-d0a4feada930\") " pod="openshift-controller-manager/controller-manager-574c848897-sqzn9" Nov 28 13:24:38 crc kubenswrapper[4970]: I1128 13:24:38.617242 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4369b978-7720-433e-ba2a-ef658c76b0b2-serving-cert\") pod \"route-controller-manager-657fdc8645-zvsmm\" (UID: \"4369b978-7720-433e-ba2a-ef658c76b0b2\") " pod="openshift-route-controller-manager/route-controller-manager-657fdc8645-zvsmm" Nov 28 13:24:38 crc kubenswrapper[4970]: I1128 13:24:38.617283 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4369b978-7720-433e-ba2a-ef658c76b0b2-client-ca\") pod \"route-controller-manager-657fdc8645-zvsmm\" (UID: \"4369b978-7720-433e-ba2a-ef658c76b0b2\") " pod="openshift-route-controller-manager/route-controller-manager-657fdc8645-zvsmm" Nov 28 13:24:38 crc kubenswrapper[4970]: I1128 13:24:38.617316 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a4c98945-289e-4bad-ae74-d0a4feada930-serving-cert\") pod 
\"controller-manager-574c848897-sqzn9\" (UID: \"a4c98945-289e-4bad-ae74-d0a4feada930\") " pod="openshift-controller-manager/controller-manager-574c848897-sqzn9" Nov 28 13:24:38 crc kubenswrapper[4970]: I1128 13:24:38.617345 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4369b978-7720-433e-ba2a-ef658c76b0b2-config\") pod \"route-controller-manager-657fdc8645-zvsmm\" (UID: \"4369b978-7720-433e-ba2a-ef658c76b0b2\") " pod="openshift-route-controller-manager/route-controller-manager-657fdc8645-zvsmm" Nov 28 13:24:38 crc kubenswrapper[4970]: I1128 13:24:38.618839 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/a4c98945-289e-4bad-ae74-d0a4feada930-proxy-ca-bundles\") pod \"controller-manager-574c848897-sqzn9\" (UID: \"a4c98945-289e-4bad-ae74-d0a4feada930\") " pod="openshift-controller-manager/controller-manager-574c848897-sqzn9" Nov 28 13:24:38 crc kubenswrapper[4970]: I1128 13:24:38.618864 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4369b978-7720-433e-ba2a-ef658c76b0b2-client-ca\") pod \"route-controller-manager-657fdc8645-zvsmm\" (UID: \"4369b978-7720-433e-ba2a-ef658c76b0b2\") " pod="openshift-route-controller-manager/route-controller-manager-657fdc8645-zvsmm" Nov 28 13:24:38 crc kubenswrapper[4970]: I1128 13:24:38.618954 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a4c98945-289e-4bad-ae74-d0a4feada930-client-ca\") pod \"controller-manager-574c848897-sqzn9\" (UID: \"a4c98945-289e-4bad-ae74-d0a4feada930\") " pod="openshift-controller-manager/controller-manager-574c848897-sqzn9" Nov 28 13:24:38 crc kubenswrapper[4970]: I1128 13:24:38.619520 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4369b978-7720-433e-ba2a-ef658c76b0b2-config\") pod \"route-controller-manager-657fdc8645-zvsmm\" (UID: \"4369b978-7720-433e-ba2a-ef658c76b0b2\") " pod="openshift-route-controller-manager/route-controller-manager-657fdc8645-zvsmm" Nov 28 13:24:38 crc kubenswrapper[4970]: I1128 13:24:38.619622 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a4c98945-289e-4bad-ae74-d0a4feada930-config\") pod \"controller-manager-574c848897-sqzn9\" (UID: \"a4c98945-289e-4bad-ae74-d0a4feada930\") " pod="openshift-controller-manager/controller-manager-574c848897-sqzn9" Nov 28 13:24:38 crc kubenswrapper[4970]: I1128 13:24:38.624976 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a4c98945-289e-4bad-ae74-d0a4feada930-serving-cert\") pod \"controller-manager-574c848897-sqzn9\" (UID: \"a4c98945-289e-4bad-ae74-d0a4feada930\") " pod="openshift-controller-manager/controller-manager-574c848897-sqzn9" Nov 28 13:24:38 crc kubenswrapper[4970]: I1128 13:24:38.633500 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4369b978-7720-433e-ba2a-ef658c76b0b2-serving-cert\") pod \"route-controller-manager-657fdc8645-zvsmm\" (UID: \"4369b978-7720-433e-ba2a-ef658c76b0b2\") " pod="openshift-route-controller-manager/route-controller-manager-657fdc8645-zvsmm" Nov 28 13:24:38 crc kubenswrapper[4970]: I1128 13:24:38.650007 4970 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6csst\" (UniqueName: \"kubernetes.io/projected/a4c98945-289e-4bad-ae74-d0a4feada930-kube-api-access-6csst\") pod \"controller-manager-574c848897-sqzn9\" (UID: \"a4c98945-289e-4bad-ae74-d0a4feada930\") " pod="openshift-controller-manager/controller-manager-574c848897-sqzn9" Nov 28 13:24:38 crc kubenswrapper[4970]: I1128 13:24:38.662890 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l4gnr\" (UniqueName: \"kubernetes.io/projected/4369b978-7720-433e-ba2a-ef658c76b0b2-kube-api-access-l4gnr\") pod \"route-controller-manager-657fdc8645-zvsmm\" (UID: \"4369b978-7720-433e-ba2a-ef658c76b0b2\") " pod="openshift-route-controller-manager/route-controller-manager-657fdc8645-zvsmm" Nov 28 13:24:38 crc kubenswrapper[4970]: I1128 13:24:38.705518 4970 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-657fdc8645-zvsmm" Nov 28 13:24:38 crc kubenswrapper[4970]: I1128 13:24:38.721092 4970 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-574c848897-sqzn9" Nov 28 13:24:38 crc kubenswrapper[4970]: I1128 13:24:38.953059 4970 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-657fdc8645-zvsmm"] Nov 28 13:24:38 crc kubenswrapper[4970]: W1128 13:24:38.957705 4970 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4369b978_7720_433e_ba2a_ef658c76b0b2.slice/crio-884865bd2c0eae9795b57ff7486cc8912912d5b68651ef4a07c3a4b24f3acc08 WatchSource:0}: Error finding container 884865bd2c0eae9795b57ff7486cc8912912d5b68651ef4a07c3a4b24f3acc08: Status 404 returned error can't find the container with id 884865bd2c0eae9795b57ff7486cc8912912d5b68651ef4a07c3a4b24f3acc08 Nov 28 13:24:39 crc kubenswrapper[4970]: I1128 13:24:39.001713 4970 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-574c848897-sqzn9"] Nov 28 13:24:39 crc kubenswrapper[4970]: W1128 13:24:39.006879 4970 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda4c98945_289e_4bad_ae74_d0a4feada930.slice/crio-0530b5c7ef19f10c6ffc26ea83f9cba094dadaf2c95cb8d7044ce5b0172a4054 WatchSource:0}: Error finding container 0530b5c7ef19f10c6ffc26ea83f9cba094dadaf2c95cb8d7044ce5b0172a4054: Status 404 returned error can't find the container with id 0530b5c7ef19f10c6ffc26ea83f9cba094dadaf2c95cb8d7044ce5b0172a4054 Nov 28 13:24:39 crc kubenswrapper[4970]: I1128 13:24:39.392049 4970 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="900e9596-8294-4c4d-857a-1b2bf9adaca7" path="/var/lib/kubelet/pods/900e9596-8294-4c4d-857a-1b2bf9adaca7/volumes" Nov 28 13:24:39 crc kubenswrapper[4970]: I1128 13:24:39.393373 4970 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e80ce492-28d4-40cf-8a55-5a4f456e8255" path="/var/lib/kubelet/pods/e80ce492-28d4-40cf-8a55-5a4f456e8255/volumes" Nov 28 13:24:39 crc kubenswrapper[4970]: I1128 13:24:39.477468 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-657fdc8645-zvsmm" 
event={"ID":"4369b978-7720-433e-ba2a-ef658c76b0b2","Type":"ContainerStarted","Data":"884865bd2c0eae9795b57ff7486cc8912912d5b68651ef4a07c3a4b24f3acc08"} Nov 28 13:24:39 crc kubenswrapper[4970]: I1128 13:24:39.479378 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-574c848897-sqzn9" event={"ID":"a4c98945-289e-4bad-ae74-d0a4feada930","Type":"ContainerStarted","Data":"0530b5c7ef19f10c6ffc26ea83f9cba094dadaf2c95cb8d7044ce5b0172a4054"} Nov 28 13:24:40 crc kubenswrapper[4970]: I1128 13:24:40.485919 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-574c848897-sqzn9" event={"ID":"a4c98945-289e-4bad-ae74-d0a4feada930","Type":"ContainerStarted","Data":"d1641923deb8c34d166781b8fd3e9cdd0c4ec1f0439a97161948a9568ee5082d"} Nov 28 13:24:40 crc kubenswrapper[4970]: I1128 13:24:40.486200 4970 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-574c848897-sqzn9" Nov 28 13:24:40 crc kubenswrapper[4970]: I1128 13:24:40.489601 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-657fdc8645-zvsmm" event={"ID":"4369b978-7720-433e-ba2a-ef658c76b0b2","Type":"ContainerStarted","Data":"5dc9690a7db96736a9c9fd1b5870947b048d3403f7810882625e37bb5e25d7ff"} Nov 28 13:24:40 crc kubenswrapper[4970]: I1128 13:24:40.489879 4970 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-657fdc8645-zvsmm" Nov 28 13:24:40 crc kubenswrapper[4970]: I1128 13:24:40.495109 4970 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-574c848897-sqzn9" Nov 28 13:24:40 crc kubenswrapper[4970]: I1128 13:24:40.501009 4970 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-657fdc8645-zvsmm" Nov 28 13:24:40 crc kubenswrapper[4970]: I1128 13:24:40.507650 4970 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-574c848897-sqzn9" podStartSLOduration=3.507632057 podStartE2EDuration="3.507632057s" podCreationTimestamp="2025-11-28 13:24:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-28 13:24:40.506432691 +0000 UTC m=+291.359314511" watchObservedRunningTime="2025-11-28 13:24:40.507632057 +0000 UTC m=+291.360513867" Nov 28 13:24:40 crc kubenswrapper[4970]: I1128 13:24:40.581786 4970 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-657fdc8645-zvsmm" podStartSLOduration=3.581765168 podStartE2EDuration="3.581765168s" podCreationTimestamp="2025-11-28 13:24:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-28 13:24:40.578095137 +0000 UTC m=+291.430976947" watchObservedRunningTime="2025-11-28 13:24:40.581765168 +0000 UTC m=+291.434646968" Nov 28 13:24:51 crc kubenswrapper[4970]: I1128 13:24:51.021819 4970 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-nzvwp"] Nov 28 13:24:51 crc kubenswrapper[4970]: I1128 13:24:51.023005 4970 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-marketplace/redhat-marketplace-nzvwp" podUID="4cae710a-4284-4d76-b507-d7aa55adba72" containerName="registry-server" containerID="cri-o://4460ae67a212301ea161ef043a360dfa83a368ae92cb4fe6950f58a61b0899cc" gracePeriod=2 Nov 28 13:24:51 crc kubenswrapper[4970]: I1128 13:24:51.220095 4970 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-fp5dk"] Nov 28 13:24:51 crc kubenswrapper[4970]: I1128 13:24:51.220334 4970 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-fp5dk" podUID="7d9651e6-3bf3-4aad-b2ef-8fb6c42eea44" containerName="registry-server" containerID="cri-o://49063d2a8e369100f42c57c9f14ee6d892d09e8e1f031f7b9fccc09f1ac7be64" gracePeriod=2 Nov 28 13:24:51 crc kubenswrapper[4970]: I1128 13:24:51.520453 4970 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-nzvwp" Nov 28 13:24:51 crc kubenswrapper[4970]: I1128 13:24:51.558891 4970 generic.go:334] "Generic (PLEG): container finished" podID="7d9651e6-3bf3-4aad-b2ef-8fb6c42eea44" containerID="49063d2a8e369100f42c57c9f14ee6d892d09e8e1f031f7b9fccc09f1ac7be64" exitCode=0 Nov 28 13:24:51 crc kubenswrapper[4970]: I1128 13:24:51.558961 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fp5dk" event={"ID":"7d9651e6-3bf3-4aad-b2ef-8fb6c42eea44","Type":"ContainerDied","Data":"49063d2a8e369100f42c57c9f14ee6d892d09e8e1f031f7b9fccc09f1ac7be64"} Nov 28 13:24:51 crc kubenswrapper[4970]: I1128 13:24:51.560722 4970 generic.go:334] "Generic (PLEG): container finished" podID="4cae710a-4284-4d76-b507-d7aa55adba72" containerID="4460ae67a212301ea161ef043a360dfa83a368ae92cb4fe6950f58a61b0899cc" exitCode=0 Nov 28 13:24:51 crc kubenswrapper[4970]: I1128 13:24:51.560746 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-nzvwp" event={"ID":"4cae710a-4284-4d76-b507-d7aa55adba72","Type":"ContainerDied","Data":"4460ae67a212301ea161ef043a360dfa83a368ae92cb4fe6950f58a61b0899cc"} Nov 28 13:24:51 crc kubenswrapper[4970]: I1128 13:24:51.560765 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-nzvwp" event={"ID":"4cae710a-4284-4d76-b507-d7aa55adba72","Type":"ContainerDied","Data":"608e565eced9fac804dc65085933b01f5b3309ad1e04b99b18e6465efaf67484"} Nov 28 13:24:51 crc kubenswrapper[4970]: I1128 13:24:51.560792 4970 scope.go:117] "RemoveContainer" containerID="4460ae67a212301ea161ef043a360dfa83a368ae92cb4fe6950f58a61b0899cc" Nov 28 13:24:51 crc kubenswrapper[4970]: I1128 13:24:51.560871 4970 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-nzvwp" Nov 28 13:24:51 crc kubenswrapper[4970]: I1128 13:24:51.590717 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4cae710a-4284-4d76-b507-d7aa55adba72-utilities\") pod \"4cae710a-4284-4d76-b507-d7aa55adba72\" (UID: \"4cae710a-4284-4d76-b507-d7aa55adba72\") " Nov 28 13:24:51 crc kubenswrapper[4970]: I1128 13:24:51.590778 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pr975\" (UniqueName: \"kubernetes.io/projected/4cae710a-4284-4d76-b507-d7aa55adba72-kube-api-access-pr975\") pod \"4cae710a-4284-4d76-b507-d7aa55adba72\" (UID: \"4cae710a-4284-4d76-b507-d7aa55adba72\") " Nov 28 13:24:51 crc kubenswrapper[4970]: I1128 13:24:51.590860 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4cae710a-4284-4d76-b507-d7aa55adba72-catalog-content\") pod \"4cae710a-4284-4d76-b507-d7aa55adba72\" (UID: \"4cae710a-4284-4d76-b507-d7aa55adba72\") " Nov 28 13:24:51 crc kubenswrapper[4970]: I1128 13:24:51.591752 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4cae710a-4284-4d76-b507-d7aa55adba72-utilities" (OuterVolumeSpecName: "utilities") pod "4cae710a-4284-4d76-b507-d7aa55adba72" (UID: "4cae710a-4284-4d76-b507-d7aa55adba72"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 28 13:24:51 crc kubenswrapper[4970]: I1128 13:24:51.595578 4970 scope.go:117] "RemoveContainer" containerID="199e4a0230ff86d77ea52d3e3936209bc759b65e09d92423fffbcc087ab1f9e4" Nov 28 13:24:51 crc kubenswrapper[4970]: I1128 13:24:51.604435 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4cae710a-4284-4d76-b507-d7aa55adba72-kube-api-access-pr975" (OuterVolumeSpecName: "kube-api-access-pr975") pod "4cae710a-4284-4d76-b507-d7aa55adba72" (UID: "4cae710a-4284-4d76-b507-d7aa55adba72"). InnerVolumeSpecName "kube-api-access-pr975". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 28 13:24:51 crc kubenswrapper[4970]: I1128 13:24:51.607424 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4cae710a-4284-4d76-b507-d7aa55adba72-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "4cae710a-4284-4d76-b507-d7aa55adba72" (UID: "4cae710a-4284-4d76-b507-d7aa55adba72"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 28 13:24:51 crc kubenswrapper[4970]: I1128 13:24:51.632246 4970 scope.go:117] "RemoveContainer" containerID="b57a59417898fc205319aa12c7e8a7a4eacd104247be5f4b01e36f5b17fc90e5" Nov 28 13:24:51 crc kubenswrapper[4970]: I1128 13:24:51.645439 4970 scope.go:117] "RemoveContainer" containerID="4460ae67a212301ea161ef043a360dfa83a368ae92cb4fe6950f58a61b0899cc" Nov 28 13:24:51 crc kubenswrapper[4970]: E1128 13:24:51.645843 4970 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4460ae67a212301ea161ef043a360dfa83a368ae92cb4fe6950f58a61b0899cc\": container with ID starting with 4460ae67a212301ea161ef043a360dfa83a368ae92cb4fe6950f58a61b0899cc not found: ID does not exist" containerID="4460ae67a212301ea161ef043a360dfa83a368ae92cb4fe6950f58a61b0899cc" Nov 28 13:24:51 crc kubenswrapper[4970]: I1128 13:24:51.645874 4970 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4460ae67a212301ea161ef043a360dfa83a368ae92cb4fe6950f58a61b0899cc"} err="failed to get container status \"4460ae67a212301ea161ef043a360dfa83a368ae92cb4fe6950f58a61b0899cc\": rpc error: code = NotFound desc = could not find container \"4460ae67a212301ea161ef043a360dfa83a368ae92cb4fe6950f58a61b0899cc\": container with ID starting with 4460ae67a212301ea161ef043a360dfa83a368ae92cb4fe6950f58a61b0899cc not found: ID does not exist" Nov 28 13:24:51 crc kubenswrapper[4970]: I1128 13:24:51.645894 4970 scope.go:117] "RemoveContainer" containerID="199e4a0230ff86d77ea52d3e3936209bc759b65e09d92423fffbcc087ab1f9e4" Nov 28 13:24:51 crc kubenswrapper[4970]: E1128 13:24:51.646971 4970 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"199e4a0230ff86d77ea52d3e3936209bc759b65e09d92423fffbcc087ab1f9e4\": container with ID starting with 199e4a0230ff86d77ea52d3e3936209bc759b65e09d92423fffbcc087ab1f9e4 not found: ID does not exist" containerID="199e4a0230ff86d77ea52d3e3936209bc759b65e09d92423fffbcc087ab1f9e4" Nov 28 13:24:51 crc kubenswrapper[4970]: I1128 13:24:51.647003 4970 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"199e4a0230ff86d77ea52d3e3936209bc759b65e09d92423fffbcc087ab1f9e4"} err="failed to get container status \"199e4a0230ff86d77ea52d3e3936209bc759b65e09d92423fffbcc087ab1f9e4\": rpc error: code = NotFound desc = could not find container \"199e4a0230ff86d77ea52d3e3936209bc759b65e09d92423fffbcc087ab1f9e4\": container with ID starting with 199e4a0230ff86d77ea52d3e3936209bc759b65e09d92423fffbcc087ab1f9e4 not found: ID does not exist" Nov 28 13:24:51 crc kubenswrapper[4970]: I1128 13:24:51.647022 4970 scope.go:117] "RemoveContainer" containerID="b57a59417898fc205319aa12c7e8a7a4eacd104247be5f4b01e36f5b17fc90e5" Nov 28 13:24:51 crc kubenswrapper[4970]: E1128 13:24:51.647285 4970 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b57a59417898fc205319aa12c7e8a7a4eacd104247be5f4b01e36f5b17fc90e5\": container with ID starting with b57a59417898fc205319aa12c7e8a7a4eacd104247be5f4b01e36f5b17fc90e5 not found: ID does not exist" containerID="b57a59417898fc205319aa12c7e8a7a4eacd104247be5f4b01e36f5b17fc90e5" Nov 28 13:24:51 crc kubenswrapper[4970]: I1128 13:24:51.647305 4970 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"b57a59417898fc205319aa12c7e8a7a4eacd104247be5f4b01e36f5b17fc90e5"} err="failed to get container status \"b57a59417898fc205319aa12c7e8a7a4eacd104247be5f4b01e36f5b17fc90e5\": rpc error: code = NotFound desc = could not find container \"b57a59417898fc205319aa12c7e8a7a4eacd104247be5f4b01e36f5b17fc90e5\": container with ID starting with b57a59417898fc205319aa12c7e8a7a4eacd104247be5f4b01e36f5b17fc90e5 not found: ID does not exist" Nov 28 13:24:51 crc kubenswrapper[4970]: I1128 13:24:51.674107 4970 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-fp5dk" Nov 28 13:24:51 crc kubenswrapper[4970]: I1128 13:24:51.691987 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7d9651e6-3bf3-4aad-b2ef-8fb6c42eea44-utilities\") pod \"7d9651e6-3bf3-4aad-b2ef-8fb6c42eea44\" (UID: \"7d9651e6-3bf3-4aad-b2ef-8fb6c42eea44\") " Nov 28 13:24:51 crc kubenswrapper[4970]: I1128 13:24:51.692075 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tdkxs\" (UniqueName: \"kubernetes.io/projected/7d9651e6-3bf3-4aad-b2ef-8fb6c42eea44-kube-api-access-tdkxs\") pod \"7d9651e6-3bf3-4aad-b2ef-8fb6c42eea44\" (UID: \"7d9651e6-3bf3-4aad-b2ef-8fb6c42eea44\") " Nov 28 13:24:51 crc kubenswrapper[4970]: I1128 13:24:51.692146 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7d9651e6-3bf3-4aad-b2ef-8fb6c42eea44-catalog-content\") pod \"7d9651e6-3bf3-4aad-b2ef-8fb6c42eea44\" (UID: \"7d9651e6-3bf3-4aad-b2ef-8fb6c42eea44\") " Nov 28 13:24:51 crc kubenswrapper[4970]: I1128 13:24:51.692401 4970 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4cae710a-4284-4d76-b507-d7aa55adba72-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 28 13:24:51 crc kubenswrapper[4970]: I1128 13:24:51.692423 4970 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4cae710a-4284-4d76-b507-d7aa55adba72-utilities\") on node \"crc\" DevicePath \"\"" Nov 28 13:24:51 crc kubenswrapper[4970]: I1128 13:24:51.692436 4970 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pr975\" (UniqueName: \"kubernetes.io/projected/4cae710a-4284-4d76-b507-d7aa55adba72-kube-api-access-pr975\") on node \"crc\" DevicePath \"\"" Nov 28 13:24:51 crc kubenswrapper[4970]: I1128 13:24:51.692718 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7d9651e6-3bf3-4aad-b2ef-8fb6c42eea44-utilities" (OuterVolumeSpecName: "utilities") pod "7d9651e6-3bf3-4aad-b2ef-8fb6c42eea44" (UID: "7d9651e6-3bf3-4aad-b2ef-8fb6c42eea44"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 28 13:24:51 crc kubenswrapper[4970]: I1128 13:24:51.749481 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7d9651e6-3bf3-4aad-b2ef-8fb6c42eea44-kube-api-access-tdkxs" (OuterVolumeSpecName: "kube-api-access-tdkxs") pod "7d9651e6-3bf3-4aad-b2ef-8fb6c42eea44" (UID: "7d9651e6-3bf3-4aad-b2ef-8fb6c42eea44"). InnerVolumeSpecName "kube-api-access-tdkxs". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 28 13:24:51 crc kubenswrapper[4970]: I1128 13:24:51.796646 4970 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7d9651e6-3bf3-4aad-b2ef-8fb6c42eea44-utilities\") on node \"crc\" DevicePath \"\"" Nov 28 13:24:51 crc kubenswrapper[4970]: I1128 13:24:51.796677 4970 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tdkxs\" (UniqueName: \"kubernetes.io/projected/7d9651e6-3bf3-4aad-b2ef-8fb6c42eea44-kube-api-access-tdkxs\") on node \"crc\" DevicePath \"\"" Nov 28 13:24:51 crc kubenswrapper[4970]: I1128 13:24:51.804916 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7d9651e6-3bf3-4aad-b2ef-8fb6c42eea44-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "7d9651e6-3bf3-4aad-b2ef-8fb6c42eea44" (UID: "7d9651e6-3bf3-4aad-b2ef-8fb6c42eea44"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 28 13:24:51 crc kubenswrapper[4970]: I1128 13:24:51.887889 4970 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-nzvwp"] Nov 28 13:24:51 crc kubenswrapper[4970]: I1128 13:24:51.892733 4970 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-nzvwp"] Nov 28 13:24:51 crc kubenswrapper[4970]: I1128 13:24:51.898392 4970 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7d9651e6-3bf3-4aad-b2ef-8fb6c42eea44-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 28 13:24:52 crc kubenswrapper[4970]: I1128 13:24:52.569661 4970 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-fp5dk" Nov 28 13:24:52 crc kubenswrapper[4970]: I1128 13:24:52.569663 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fp5dk" event={"ID":"7d9651e6-3bf3-4aad-b2ef-8fb6c42eea44","Type":"ContainerDied","Data":"48f75a5fce6a9a368f5ce73833ee59ddcc53540ea63ec8ccb6d954e252cfd5b4"} Nov 28 13:24:52 crc kubenswrapper[4970]: I1128 13:24:52.570104 4970 scope.go:117] "RemoveContainer" containerID="49063d2a8e369100f42c57c9f14ee6d892d09e8e1f031f7b9fccc09f1ac7be64" Nov 28 13:24:52 crc kubenswrapper[4970]: I1128 13:24:52.588054 4970 scope.go:117] "RemoveContainer" containerID="1ed8a9fa1ce3fdc311f9a2e6537817b63a75f66ee54ad495c9c87a816e94986a" Nov 28 13:24:52 crc kubenswrapper[4970]: I1128 13:24:52.601315 4970 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-fp5dk"] Nov 28 13:24:52 crc kubenswrapper[4970]: I1128 13:24:52.609897 4970 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-fp5dk"] Nov 28 13:24:52 crc kubenswrapper[4970]: I1128 13:24:52.624613 4970 scope.go:117] "RemoveContainer" containerID="32586cf197490c07dac50c81b83615ac99ca6abe43778b63e072f8737d6d7b65" Nov 28 13:24:53 crc kubenswrapper[4970]: I1128 13:24:53.390892 4970 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4cae710a-4284-4d76-b507-d7aa55adba72" path="/var/lib/kubelet/pods/4cae710a-4284-4d76-b507-d7aa55adba72/volumes" Nov 28 13:24:53 crc kubenswrapper[4970]: I1128 13:24:53.392314 4970 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7d9651e6-3bf3-4aad-b2ef-8fb6c42eea44" path="/var/lib/kubelet/pods/7d9651e6-3bf3-4aad-b2ef-8fb6c42eea44/volumes" Nov 28 
13:24:53 crc kubenswrapper[4970]: I1128 13:24:53.419785 4970 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-5jvzw"] Nov 28 13:24:53 crc kubenswrapper[4970]: I1128 13:24:53.420030 4970 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-5jvzw" podUID="ea4c7183-d326-44cd-8e40-649e3dad901e" containerName="registry-server" containerID="cri-o://8f48410108384348a23dcc8ad95b89dd4a08166513e5ce8137535e346aee3a6a" gracePeriod=2 Nov 28 13:24:54 crc kubenswrapper[4970]: I1128 13:24:54.351652 4970 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-5jvzw" Nov 28 13:24:54 crc kubenswrapper[4970]: I1128 13:24:54.529900 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ea4c7183-d326-44cd-8e40-649e3dad901e-catalog-content\") pod \"ea4c7183-d326-44cd-8e40-649e3dad901e\" (UID: \"ea4c7183-d326-44cd-8e40-649e3dad901e\") " Nov 28 13:24:54 crc kubenswrapper[4970]: I1128 13:24:54.529992 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ea4c7183-d326-44cd-8e40-649e3dad901e-utilities\") pod \"ea4c7183-d326-44cd-8e40-649e3dad901e\" (UID: \"ea4c7183-d326-44cd-8e40-649e3dad901e\") " Nov 28 13:24:54 crc kubenswrapper[4970]: I1128 13:24:54.530038 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vr2dl\" (UniqueName: \"kubernetes.io/projected/ea4c7183-d326-44cd-8e40-649e3dad901e-kube-api-access-vr2dl\") pod \"ea4c7183-d326-44cd-8e40-649e3dad901e\" (UID: \"ea4c7183-d326-44cd-8e40-649e3dad901e\") " Nov 28 13:24:54 crc kubenswrapper[4970]: I1128 13:24:54.531037 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ea4c7183-d326-44cd-8e40-649e3dad901e-utilities" (OuterVolumeSpecName: "utilities") pod "ea4c7183-d326-44cd-8e40-649e3dad901e" (UID: "ea4c7183-d326-44cd-8e40-649e3dad901e"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 28 13:24:54 crc kubenswrapper[4970]: I1128 13:24:54.535660 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ea4c7183-d326-44cd-8e40-649e3dad901e-kube-api-access-vr2dl" (OuterVolumeSpecName: "kube-api-access-vr2dl") pod "ea4c7183-d326-44cd-8e40-649e3dad901e" (UID: "ea4c7183-d326-44cd-8e40-649e3dad901e"). InnerVolumeSpecName "kube-api-access-vr2dl". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 28 13:24:54 crc kubenswrapper[4970]: I1128 13:24:54.580141 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ea4c7183-d326-44cd-8e40-649e3dad901e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ea4c7183-d326-44cd-8e40-649e3dad901e" (UID: "ea4c7183-d326-44cd-8e40-649e3dad901e"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 28 13:24:54 crc kubenswrapper[4970]: I1128 13:24:54.584171 4970 generic.go:334] "Generic (PLEG): container finished" podID="ea4c7183-d326-44cd-8e40-649e3dad901e" containerID="8f48410108384348a23dcc8ad95b89dd4a08166513e5ce8137535e346aee3a6a" exitCode=0 Nov 28 13:24:54 crc kubenswrapper[4970]: I1128 13:24:54.584234 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5jvzw" event={"ID":"ea4c7183-d326-44cd-8e40-649e3dad901e","Type":"ContainerDied","Data":"8f48410108384348a23dcc8ad95b89dd4a08166513e5ce8137535e346aee3a6a"} Nov 28 13:24:54 crc kubenswrapper[4970]: I1128 13:24:54.584274 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5jvzw" event={"ID":"ea4c7183-d326-44cd-8e40-649e3dad901e","Type":"ContainerDied","Data":"411d2fc0ceb52772aa19488c50007a1434462d8782dc69f22d7195a1989afc09"} Nov 28 13:24:54 crc kubenswrapper[4970]: I1128 13:24:54.584296 4970 scope.go:117] "RemoveContainer" containerID="8f48410108384348a23dcc8ad95b89dd4a08166513e5ce8137535e346aee3a6a" Nov 28 13:24:54 crc kubenswrapper[4970]: I1128 13:24:54.584309 4970 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-5jvzw" Nov 28 13:24:54 crc kubenswrapper[4970]: I1128 13:24:54.622839 4970 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-5jvzw"] Nov 28 13:24:54 crc kubenswrapper[4970]: I1128 13:24:54.622887 4970 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-5jvzw"] Nov 28 13:24:54 crc kubenswrapper[4970]: I1128 13:24:54.631429 4970 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ea4c7183-d326-44cd-8e40-649e3dad901e-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 28 13:24:54 crc kubenswrapper[4970]: I1128 13:24:54.631452 4970 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ea4c7183-d326-44cd-8e40-649e3dad901e-utilities\") on node \"crc\" DevicePath \"\"" Nov 28 13:24:54 crc kubenswrapper[4970]: I1128 13:24:54.631466 4970 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vr2dl\" (UniqueName: \"kubernetes.io/projected/ea4c7183-d326-44cd-8e40-649e3dad901e-kube-api-access-vr2dl\") on node \"crc\" DevicePath \"\"" Nov 28 13:24:54 crc kubenswrapper[4970]: I1128 13:24:54.631695 4970 scope.go:117] "RemoveContainer" containerID="099d18e9871153a8d25104339be29911068bc6a9315d9bf99e798b8b6077b826" Nov 28 13:24:54 crc kubenswrapper[4970]: I1128 13:24:54.650804 4970 scope.go:117] "RemoveContainer" containerID="e7fa2e9098159163db26df972fe0f1a1fbeec851879837c3d9e3e9c90397ebf3" Nov 28 13:24:54 crc kubenswrapper[4970]: I1128 13:24:54.678268 4970 scope.go:117] "RemoveContainer" containerID="8f48410108384348a23dcc8ad95b89dd4a08166513e5ce8137535e346aee3a6a" Nov 28 13:24:54 crc kubenswrapper[4970]: E1128 13:24:54.678779 4970 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8f48410108384348a23dcc8ad95b89dd4a08166513e5ce8137535e346aee3a6a\": container with ID starting with 8f48410108384348a23dcc8ad95b89dd4a08166513e5ce8137535e346aee3a6a not found: ID does not exist" containerID="8f48410108384348a23dcc8ad95b89dd4a08166513e5ce8137535e346aee3a6a" Nov 28 13:24:54 crc kubenswrapper[4970]: I1128 13:24:54.678814 
4970 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8f48410108384348a23dcc8ad95b89dd4a08166513e5ce8137535e346aee3a6a"} err="failed to get container status \"8f48410108384348a23dcc8ad95b89dd4a08166513e5ce8137535e346aee3a6a\": rpc error: code = NotFound desc = could not find container \"8f48410108384348a23dcc8ad95b89dd4a08166513e5ce8137535e346aee3a6a\": container with ID starting with 8f48410108384348a23dcc8ad95b89dd4a08166513e5ce8137535e346aee3a6a not found: ID does not exist" Nov 28 13:24:54 crc kubenswrapper[4970]: I1128 13:24:54.678838 4970 scope.go:117] "RemoveContainer" containerID="099d18e9871153a8d25104339be29911068bc6a9315d9bf99e798b8b6077b826" Nov 28 13:24:54 crc kubenswrapper[4970]: E1128 13:24:54.679118 4970 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"099d18e9871153a8d25104339be29911068bc6a9315d9bf99e798b8b6077b826\": container with ID starting with 099d18e9871153a8d25104339be29911068bc6a9315d9bf99e798b8b6077b826 not found: ID does not exist" containerID="099d18e9871153a8d25104339be29911068bc6a9315d9bf99e798b8b6077b826" Nov 28 13:24:54 crc kubenswrapper[4970]: I1128 13:24:54.679145 4970 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"099d18e9871153a8d25104339be29911068bc6a9315d9bf99e798b8b6077b826"} err="failed to get container status \"099d18e9871153a8d25104339be29911068bc6a9315d9bf99e798b8b6077b826\": rpc error: code = NotFound desc = could not find container \"099d18e9871153a8d25104339be29911068bc6a9315d9bf99e798b8b6077b826\": container with ID starting with 099d18e9871153a8d25104339be29911068bc6a9315d9bf99e798b8b6077b826 not found: ID does not exist" Nov 28 13:24:54 crc kubenswrapper[4970]: I1128 13:24:54.679160 4970 scope.go:117] "RemoveContainer" containerID="e7fa2e9098159163db26df972fe0f1a1fbeec851879837c3d9e3e9c90397ebf3" Nov 28 13:24:54 crc kubenswrapper[4970]: E1128 13:24:54.679495 4970 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e7fa2e9098159163db26df972fe0f1a1fbeec851879837c3d9e3e9c90397ebf3\": container with ID starting with e7fa2e9098159163db26df972fe0f1a1fbeec851879837c3d9e3e9c90397ebf3 not found: ID does not exist" containerID="e7fa2e9098159163db26df972fe0f1a1fbeec851879837c3d9e3e9c90397ebf3" Nov 28 13:24:54 crc kubenswrapper[4970]: I1128 13:24:54.679518 4970 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e7fa2e9098159163db26df972fe0f1a1fbeec851879837c3d9e3e9c90397ebf3"} err="failed to get container status \"e7fa2e9098159163db26df972fe0f1a1fbeec851879837c3d9e3e9c90397ebf3\": rpc error: code = NotFound desc = could not find container \"e7fa2e9098159163db26df972fe0f1a1fbeec851879837c3d9e3e9c90397ebf3\": container with ID starting with e7fa2e9098159163db26df972fe0f1a1fbeec851879837c3d9e3e9c90397ebf3 not found: ID does not exist" Nov 28 13:24:55 crc kubenswrapper[4970]: I1128 13:24:55.389600 4970 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ea4c7183-d326-44cd-8e40-649e3dad901e" path="/var/lib/kubelet/pods/ea4c7183-d326-44cd-8e40-649e3dad901e/volumes" Nov 28 13:24:56 crc kubenswrapper[4970]: I1128 13:24:56.070269 4970 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-zmjh4"] Nov 28 13:24:56 crc kubenswrapper[4970]: E1128 13:24:56.070765 4970 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="7d9651e6-3bf3-4aad-b2ef-8fb6c42eea44" containerName="extract-content" Nov 28 13:24:56 crc kubenswrapper[4970]: I1128 13:24:56.070780 4970 state_mem.go:107] "Deleted CPUSet assignment" podUID="7d9651e6-3bf3-4aad-b2ef-8fb6c42eea44" containerName="extract-content" Nov 28 13:24:56 crc kubenswrapper[4970]: E1128 13:24:56.070795 4970 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4cae710a-4284-4d76-b507-d7aa55adba72" containerName="registry-server" Nov 28 13:24:56 crc kubenswrapper[4970]: I1128 13:24:56.070803 4970 state_mem.go:107] "Deleted CPUSet assignment" podUID="4cae710a-4284-4d76-b507-d7aa55adba72" containerName="registry-server" Nov 28 13:24:56 crc kubenswrapper[4970]: E1128 13:24:56.070820 4970 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7d9651e6-3bf3-4aad-b2ef-8fb6c42eea44" containerName="extract-utilities" Nov 28 13:24:56 crc kubenswrapper[4970]: I1128 13:24:56.070829 4970 state_mem.go:107] "Deleted CPUSet assignment" podUID="7d9651e6-3bf3-4aad-b2ef-8fb6c42eea44" containerName="extract-utilities" Nov 28 13:24:56 crc kubenswrapper[4970]: E1128 13:24:56.070839 4970 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ea4c7183-d326-44cd-8e40-649e3dad901e" containerName="extract-utilities" Nov 28 13:24:56 crc kubenswrapper[4970]: I1128 13:24:56.070847 4970 state_mem.go:107] "Deleted CPUSet assignment" podUID="ea4c7183-d326-44cd-8e40-649e3dad901e" containerName="extract-utilities" Nov 28 13:24:56 crc kubenswrapper[4970]: E1128 13:24:56.070860 4970 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4cae710a-4284-4d76-b507-d7aa55adba72" containerName="extract-content" Nov 28 13:24:56 crc kubenswrapper[4970]: I1128 13:24:56.070868 4970 state_mem.go:107] "Deleted CPUSet assignment" podUID="4cae710a-4284-4d76-b507-d7aa55adba72" containerName="extract-content" Nov 28 13:24:56 crc kubenswrapper[4970]: E1128 13:24:56.070884 4970 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ea4c7183-d326-44cd-8e40-649e3dad901e" containerName="extract-content" Nov 28 13:24:56 crc kubenswrapper[4970]: I1128 13:24:56.070893 4970 state_mem.go:107] "Deleted CPUSet assignment" podUID="ea4c7183-d326-44cd-8e40-649e3dad901e" containerName="extract-content" Nov 28 13:24:56 crc kubenswrapper[4970]: E1128 13:24:56.070904 4970 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7d9651e6-3bf3-4aad-b2ef-8fb6c42eea44" containerName="registry-server" Nov 28 13:24:56 crc kubenswrapper[4970]: I1128 13:24:56.070912 4970 state_mem.go:107] "Deleted CPUSet assignment" podUID="7d9651e6-3bf3-4aad-b2ef-8fb6c42eea44" containerName="registry-server" Nov 28 13:24:56 crc kubenswrapper[4970]: E1128 13:24:56.070921 4970 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ea4c7183-d326-44cd-8e40-649e3dad901e" containerName="registry-server" Nov 28 13:24:56 crc kubenswrapper[4970]: I1128 13:24:56.070928 4970 state_mem.go:107] "Deleted CPUSet assignment" podUID="ea4c7183-d326-44cd-8e40-649e3dad901e" containerName="registry-server" Nov 28 13:24:56 crc kubenswrapper[4970]: E1128 13:24:56.070941 4970 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4cae710a-4284-4d76-b507-d7aa55adba72" containerName="extract-utilities" Nov 28 13:24:56 crc kubenswrapper[4970]: I1128 13:24:56.070949 4970 state_mem.go:107] "Deleted CPUSet assignment" podUID="4cae710a-4284-4d76-b507-d7aa55adba72" containerName="extract-utilities" Nov 28 13:24:56 crc kubenswrapper[4970]: I1128 13:24:56.071056 4970 
memory_manager.go:354] "RemoveStaleState removing state" podUID="ea4c7183-d326-44cd-8e40-649e3dad901e" containerName="registry-server" Nov 28 13:24:56 crc kubenswrapper[4970]: I1128 13:24:56.071071 4970 memory_manager.go:354] "RemoveStaleState removing state" podUID="7d9651e6-3bf3-4aad-b2ef-8fb6c42eea44" containerName="registry-server" Nov 28 13:24:56 crc kubenswrapper[4970]: I1128 13:24:56.071081 4970 memory_manager.go:354] "RemoveStaleState removing state" podUID="4cae710a-4284-4d76-b507-d7aa55adba72" containerName="registry-server" Nov 28 13:24:56 crc kubenswrapper[4970]: I1128 13:24:56.071513 4970 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-zmjh4" Nov 28 13:24:56 crc kubenswrapper[4970]: I1128 13:24:56.086126 4970 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-zmjh4"] Nov 28 13:24:56 crc kubenswrapper[4970]: I1128 13:24:56.268475 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ad4b13e5-6e74-4244-a034-229e503bea32-trusted-ca\") pod \"image-registry-66df7c8f76-zmjh4\" (UID: \"ad4b13e5-6e74-4244-a034-229e503bea32\") " pod="openshift-image-registry/image-registry-66df7c8f76-zmjh4" Nov 28 13:24:56 crc kubenswrapper[4970]: I1128 13:24:56.268552 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/ad4b13e5-6e74-4244-a034-229e503bea32-installation-pull-secrets\") pod \"image-registry-66df7c8f76-zmjh4\" (UID: \"ad4b13e5-6e74-4244-a034-229e503bea32\") " pod="openshift-image-registry/image-registry-66df7c8f76-zmjh4" Nov 28 13:24:56 crc kubenswrapper[4970]: I1128 13:24:56.268613 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n2pwv\" (UniqueName: \"kubernetes.io/projected/ad4b13e5-6e74-4244-a034-229e503bea32-kube-api-access-n2pwv\") pod \"image-registry-66df7c8f76-zmjh4\" (UID: \"ad4b13e5-6e74-4244-a034-229e503bea32\") " pod="openshift-image-registry/image-registry-66df7c8f76-zmjh4" Nov 28 13:24:56 crc kubenswrapper[4970]: I1128 13:24:56.268640 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/ad4b13e5-6e74-4244-a034-229e503bea32-ca-trust-extracted\") pod \"image-registry-66df7c8f76-zmjh4\" (UID: \"ad4b13e5-6e74-4244-a034-229e503bea32\") " pod="openshift-image-registry/image-registry-66df7c8f76-zmjh4" Nov 28 13:24:56 crc kubenswrapper[4970]: I1128 13:24:56.268673 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-zmjh4\" (UID: \"ad4b13e5-6e74-4244-a034-229e503bea32\") " pod="openshift-image-registry/image-registry-66df7c8f76-zmjh4" Nov 28 13:24:56 crc kubenswrapper[4970]: I1128 13:24:56.268705 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/ad4b13e5-6e74-4244-a034-229e503bea32-registry-tls\") pod \"image-registry-66df7c8f76-zmjh4\" (UID: \"ad4b13e5-6e74-4244-a034-229e503bea32\") " pod="openshift-image-registry/image-registry-66df7c8f76-zmjh4" Nov 
28 13:24:56 crc kubenswrapper[4970]: I1128 13:24:56.268738 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/ad4b13e5-6e74-4244-a034-229e503bea32-bound-sa-token\") pod \"image-registry-66df7c8f76-zmjh4\" (UID: \"ad4b13e5-6e74-4244-a034-229e503bea32\") " pod="openshift-image-registry/image-registry-66df7c8f76-zmjh4" Nov 28 13:24:56 crc kubenswrapper[4970]: I1128 13:24:56.268773 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/ad4b13e5-6e74-4244-a034-229e503bea32-registry-certificates\") pod \"image-registry-66df7c8f76-zmjh4\" (UID: \"ad4b13e5-6e74-4244-a034-229e503bea32\") " pod="openshift-image-registry/image-registry-66df7c8f76-zmjh4" Nov 28 13:24:56 crc kubenswrapper[4970]: I1128 13:24:56.288071 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-zmjh4\" (UID: \"ad4b13e5-6e74-4244-a034-229e503bea32\") " pod="openshift-image-registry/image-registry-66df7c8f76-zmjh4" Nov 28 13:24:56 crc kubenswrapper[4970]: I1128 13:24:56.370323 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/ad4b13e5-6e74-4244-a034-229e503bea32-registry-certificates\") pod \"image-registry-66df7c8f76-zmjh4\" (UID: \"ad4b13e5-6e74-4244-a034-229e503bea32\") " pod="openshift-image-registry/image-registry-66df7c8f76-zmjh4" Nov 28 13:24:56 crc kubenswrapper[4970]: I1128 13:24:56.370368 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ad4b13e5-6e74-4244-a034-229e503bea32-trusted-ca\") pod \"image-registry-66df7c8f76-zmjh4\" (UID: \"ad4b13e5-6e74-4244-a034-229e503bea32\") " pod="openshift-image-registry/image-registry-66df7c8f76-zmjh4" Nov 28 13:24:56 crc kubenswrapper[4970]: I1128 13:24:56.370405 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/ad4b13e5-6e74-4244-a034-229e503bea32-installation-pull-secrets\") pod \"image-registry-66df7c8f76-zmjh4\" (UID: \"ad4b13e5-6e74-4244-a034-229e503bea32\") " pod="openshift-image-registry/image-registry-66df7c8f76-zmjh4" Nov 28 13:24:56 crc kubenswrapper[4970]: I1128 13:24:56.370447 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n2pwv\" (UniqueName: \"kubernetes.io/projected/ad4b13e5-6e74-4244-a034-229e503bea32-kube-api-access-n2pwv\") pod \"image-registry-66df7c8f76-zmjh4\" (UID: \"ad4b13e5-6e74-4244-a034-229e503bea32\") " pod="openshift-image-registry/image-registry-66df7c8f76-zmjh4" Nov 28 13:24:56 crc kubenswrapper[4970]: I1128 13:24:56.370469 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/ad4b13e5-6e74-4244-a034-229e503bea32-ca-trust-extracted\") pod \"image-registry-66df7c8f76-zmjh4\" (UID: \"ad4b13e5-6e74-4244-a034-229e503bea32\") " pod="openshift-image-registry/image-registry-66df7c8f76-zmjh4" Nov 28 13:24:56 crc kubenswrapper[4970]: I1128 13:24:56.370491 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"registry-tls\" (UniqueName: \"kubernetes.io/projected/ad4b13e5-6e74-4244-a034-229e503bea32-registry-tls\") pod \"image-registry-66df7c8f76-zmjh4\" (UID: \"ad4b13e5-6e74-4244-a034-229e503bea32\") " pod="openshift-image-registry/image-registry-66df7c8f76-zmjh4" Nov 28 13:24:56 crc kubenswrapper[4970]: I1128 13:24:56.370516 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/ad4b13e5-6e74-4244-a034-229e503bea32-bound-sa-token\") pod \"image-registry-66df7c8f76-zmjh4\" (UID: \"ad4b13e5-6e74-4244-a034-229e503bea32\") " pod="openshift-image-registry/image-registry-66df7c8f76-zmjh4" Nov 28 13:24:56 crc kubenswrapper[4970]: I1128 13:24:56.371499 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/ad4b13e5-6e74-4244-a034-229e503bea32-ca-trust-extracted\") pod \"image-registry-66df7c8f76-zmjh4\" (UID: \"ad4b13e5-6e74-4244-a034-229e503bea32\") " pod="openshift-image-registry/image-registry-66df7c8f76-zmjh4" Nov 28 13:24:56 crc kubenswrapper[4970]: I1128 13:24:56.371803 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/ad4b13e5-6e74-4244-a034-229e503bea32-registry-certificates\") pod \"image-registry-66df7c8f76-zmjh4\" (UID: \"ad4b13e5-6e74-4244-a034-229e503bea32\") " pod="openshift-image-registry/image-registry-66df7c8f76-zmjh4" Nov 28 13:24:56 crc kubenswrapper[4970]: I1128 13:24:56.372414 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ad4b13e5-6e74-4244-a034-229e503bea32-trusted-ca\") pod \"image-registry-66df7c8f76-zmjh4\" (UID: \"ad4b13e5-6e74-4244-a034-229e503bea32\") " pod="openshift-image-registry/image-registry-66df7c8f76-zmjh4" Nov 28 13:24:56 crc kubenswrapper[4970]: I1128 13:24:56.378593 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/ad4b13e5-6e74-4244-a034-229e503bea32-registry-tls\") pod \"image-registry-66df7c8f76-zmjh4\" (UID: \"ad4b13e5-6e74-4244-a034-229e503bea32\") " pod="openshift-image-registry/image-registry-66df7c8f76-zmjh4" Nov 28 13:24:56 crc kubenswrapper[4970]: I1128 13:24:56.378979 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/ad4b13e5-6e74-4244-a034-229e503bea32-installation-pull-secrets\") pod \"image-registry-66df7c8f76-zmjh4\" (UID: \"ad4b13e5-6e74-4244-a034-229e503bea32\") " pod="openshift-image-registry/image-registry-66df7c8f76-zmjh4" Nov 28 13:24:56 crc kubenswrapper[4970]: I1128 13:24:56.388691 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n2pwv\" (UniqueName: \"kubernetes.io/projected/ad4b13e5-6e74-4244-a034-229e503bea32-kube-api-access-n2pwv\") pod \"image-registry-66df7c8f76-zmjh4\" (UID: \"ad4b13e5-6e74-4244-a034-229e503bea32\") " pod="openshift-image-registry/image-registry-66df7c8f76-zmjh4" Nov 28 13:24:56 crc kubenswrapper[4970]: I1128 13:24:56.398765 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/ad4b13e5-6e74-4244-a034-229e503bea32-bound-sa-token\") pod \"image-registry-66df7c8f76-zmjh4\" (UID: \"ad4b13e5-6e74-4244-a034-229e503bea32\") " pod="openshift-image-registry/image-registry-66df7c8f76-zmjh4" Nov 28 13:24:56 crc 
kubenswrapper[4970]: I1128 13:24:56.695340 4970 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-zmjh4" Nov 28 13:24:57 crc kubenswrapper[4970]: I1128 13:24:57.132160 4970 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-zmjh4"] Nov 28 13:24:57 crc kubenswrapper[4970]: W1128 13:24:57.141320 4970 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podad4b13e5_6e74_4244_a034_229e503bea32.slice/crio-0ac05591aa4ce7bb810463a075132568c55cde451b5065f539b3f75ba844b1a4 WatchSource:0}: Error finding container 0ac05591aa4ce7bb810463a075132568c55cde451b5065f539b3f75ba844b1a4: Status 404 returned error can't find the container with id 0ac05591aa4ce7bb810463a075132568c55cde451b5065f539b3f75ba844b1a4 Nov 28 13:24:57 crc kubenswrapper[4970]: I1128 13:24:57.617794 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-zmjh4" event={"ID":"ad4b13e5-6e74-4244-a034-229e503bea32","Type":"ContainerStarted","Data":"4bfe6a6e46e31713eabb9aaf4a0da3b64493204712d03dc7979c0e743fcbde19"} Nov 28 13:24:57 crc kubenswrapper[4970]: I1128 13:24:57.617852 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-zmjh4" event={"ID":"ad4b13e5-6e74-4244-a034-229e503bea32","Type":"ContainerStarted","Data":"0ac05591aa4ce7bb810463a075132568c55cde451b5065f539b3f75ba844b1a4"} Nov 28 13:24:57 crc kubenswrapper[4970]: I1128 13:24:57.618722 4970 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-66df7c8f76-zmjh4" Nov 28 13:24:58 crc kubenswrapper[4970]: I1128 13:24:58.716669 4970 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-66df7c8f76-zmjh4" podStartSLOduration=2.7166448450000003 podStartE2EDuration="2.716644845s" podCreationTimestamp="2025-11-28 13:24:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-28 13:24:57.64612907 +0000 UTC m=+308.499010900" watchObservedRunningTime="2025-11-28 13:24:58.716644845 +0000 UTC m=+309.569526645" Nov 28 13:24:58 crc kubenswrapper[4970]: I1128 13:24:58.725365 4970 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-jcv9n"] Nov 28 13:24:58 crc kubenswrapper[4970]: I1128 13:24:58.725853 4970 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-jcv9n" podUID="619af67d-331c-4b38-b536-269ba823fd75" containerName="registry-server" containerID="cri-o://372b111f33d89b882be0cd06962157092d2b2f730628daa2a1f56f329c7476aa" gracePeriod=30 Nov 28 13:24:58 crc kubenswrapper[4970]: I1128 13:24:58.734034 4970 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-mwg2n"] Nov 28 13:24:58 crc kubenswrapper[4970]: I1128 13:24:58.734326 4970 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-mwg2n" podUID="fdf78924-9472-414e-baf6-822e511c464c" containerName="registry-server" containerID="cri-o://56aa5529733972838719518afcebdd586895c9f7e2d8edd529e9035247478a45" gracePeriod=30 Nov 28 13:24:58 crc kubenswrapper[4970]: I1128 13:24:58.756799 4970 kubelet.go:2437] "SyncLoop DELETE" 
source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-nccsb"] Nov 28 13:24:58 crc kubenswrapper[4970]: I1128 13:24:58.757104 4970 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/marketplace-operator-79b997595-nccsb" podUID="514859c2-bd3c-4ccb-90b0-61180a1bc297" containerName="marketplace-operator" containerID="cri-o://69ef22519acaf3b357a25c50f3ae3e80401c3a4f71a795b0a33d9cdc4be82cbf" gracePeriod=30 Nov 28 13:24:58 crc kubenswrapper[4970]: I1128 13:24:58.773780 4970 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-8hjm2"] Nov 28 13:24:58 crc kubenswrapper[4970]: I1128 13:24:58.774186 4970 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-8hjm2" podUID="ac87c2e3-5a6b-4998-8db7-165e571f6f52" containerName="registry-server" containerID="cri-o://61ffead2ec33c64616686a0a1bf5037299cc6617b355521167580fe7f77e448a" gracePeriod=30 Nov 28 13:24:58 crc kubenswrapper[4970]: I1128 13:24:58.787160 4970 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-wwr74"] Nov 28 13:24:58 crc kubenswrapper[4970]: I1128 13:24:58.787636 4970 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-wwr74" podUID="a92f4929-dcab-4362-a5f2-c648f274bf04" containerName="registry-server" containerID="cri-o://e42596a6fd10a064e3a8d5ef2207ddfd0fe766ec8ad644cb119a9d63d9354d75" gracePeriod=30 Nov 28 13:24:58 crc kubenswrapper[4970]: I1128 13:24:58.796717 4970 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-6lxld"] Nov 28 13:24:58 crc kubenswrapper[4970]: I1128 13:24:58.797631 4970 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-6lxld" Nov 28 13:24:58 crc kubenswrapper[4970]: I1128 13:24:58.798004 4970 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-nccsb container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.42:8080/healthz\": dial tcp 10.217.0.42:8080: connect: connection refused" start-of-body= Nov 28 13:24:58 crc kubenswrapper[4970]: I1128 13:24:58.798102 4970 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-nccsb" podUID="514859c2-bd3c-4ccb-90b0-61180a1bc297" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.42:8080/healthz\": dial tcp 10.217.0.42:8080: connect: connection refused" Nov 28 13:24:58 crc kubenswrapper[4970]: I1128 13:24:58.810303 4970 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-6lxld"] Nov 28 13:24:58 crc kubenswrapper[4970]: E1128 13:24:58.843988 4970 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of e42596a6fd10a064e3a8d5ef2207ddfd0fe766ec8ad644cb119a9d63d9354d75 is running failed: container process not found" containerID="e42596a6fd10a064e3a8d5ef2207ddfd0fe766ec8ad644cb119a9d63d9354d75" cmd=["grpc_health_probe","-addr=:50051"] Nov 28 13:24:58 crc kubenswrapper[4970]: E1128 13:24:58.844368 4970 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of e42596a6fd10a064e3a8d5ef2207ddfd0fe766ec8ad644cb119a9d63d9354d75 is running failed: container process not found" containerID="e42596a6fd10a064e3a8d5ef2207ddfd0fe766ec8ad644cb119a9d63d9354d75" cmd=["grpc_health_probe","-addr=:50051"] Nov 28 13:24:58 crc kubenswrapper[4970]: E1128 13:24:58.844607 4970 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of e42596a6fd10a064e3a8d5ef2207ddfd0fe766ec8ad644cb119a9d63d9354d75 is running failed: container process not found" containerID="e42596a6fd10a064e3a8d5ef2207ddfd0fe766ec8ad644cb119a9d63d9354d75" cmd=["grpc_health_probe","-addr=:50051"] Nov 28 13:24:58 crc kubenswrapper[4970]: E1128 13:24:58.844656 4970 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of e42596a6fd10a064e3a8d5ef2207ddfd0fe766ec8ad644cb119a9d63d9354d75 is running failed: container process not found" probeType="Readiness" pod="openshift-marketplace/redhat-operators-wwr74" podUID="a92f4929-dcab-4362-a5f2-c648f274bf04" containerName="registry-server" Nov 28 13:24:58 crc kubenswrapper[4970]: I1128 13:24:58.930531 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bkc9b\" (UniqueName: \"kubernetes.io/projected/cea9d8a5-d14f-4f9c-a800-815168dd799e-kube-api-access-bkc9b\") pod \"marketplace-operator-79b997595-6lxld\" (UID: \"cea9d8a5-d14f-4f9c-a800-815168dd799e\") " pod="openshift-marketplace/marketplace-operator-79b997595-6lxld" Nov 28 13:24:58 crc kubenswrapper[4970]: I1128 13:24:58.930652 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/cea9d8a5-d14f-4f9c-a800-815168dd799e-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-6lxld\" (UID: \"cea9d8a5-d14f-4f9c-a800-815168dd799e\") " pod="openshift-marketplace/marketplace-operator-79b997595-6lxld" Nov 28 13:24:58 crc kubenswrapper[4970]: I1128 13:24:58.930690 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/cea9d8a5-d14f-4f9c-a800-815168dd799e-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-6lxld\" (UID: \"cea9d8a5-d14f-4f9c-a800-815168dd799e\") " pod="openshift-marketplace/marketplace-operator-79b997595-6lxld" Nov 28 13:24:59 crc kubenswrapper[4970]: I1128 13:24:59.031745 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/cea9d8a5-d14f-4f9c-a800-815168dd799e-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-6lxld\" (UID: \"cea9d8a5-d14f-4f9c-a800-815168dd799e\") " pod="openshift-marketplace/marketplace-operator-79b997595-6lxld" Nov 28 13:24:59 crc kubenswrapper[4970]: I1128 13:24:59.031800 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/cea9d8a5-d14f-4f9c-a800-815168dd799e-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-6lxld\" (UID: \"cea9d8a5-d14f-4f9c-a800-815168dd799e\") " pod="openshift-marketplace/marketplace-operator-79b997595-6lxld" Nov 28 13:24:59 crc kubenswrapper[4970]: I1128 13:24:59.031837 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bkc9b\" (UniqueName: \"kubernetes.io/projected/cea9d8a5-d14f-4f9c-a800-815168dd799e-kube-api-access-bkc9b\") pod \"marketplace-operator-79b997595-6lxld\" (UID: \"cea9d8a5-d14f-4f9c-a800-815168dd799e\") " pod="openshift-marketplace/marketplace-operator-79b997595-6lxld" Nov 28 13:24:59 crc kubenswrapper[4970]: I1128 13:24:59.033163 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/cea9d8a5-d14f-4f9c-a800-815168dd799e-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-6lxld\" (UID: \"cea9d8a5-d14f-4f9c-a800-815168dd799e\") " pod="openshift-marketplace/marketplace-operator-79b997595-6lxld" Nov 28 13:24:59 crc kubenswrapper[4970]: I1128 13:24:59.038358 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/cea9d8a5-d14f-4f9c-a800-815168dd799e-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-6lxld\" (UID: \"cea9d8a5-d14f-4f9c-a800-815168dd799e\") " pod="openshift-marketplace/marketplace-operator-79b997595-6lxld" Nov 28 13:24:59 crc kubenswrapper[4970]: I1128 13:24:59.050642 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bkc9b\" (UniqueName: \"kubernetes.io/projected/cea9d8a5-d14f-4f9c-a800-815168dd799e-kube-api-access-bkc9b\") pod \"marketplace-operator-79b997595-6lxld\" (UID: \"cea9d8a5-d14f-4f9c-a800-815168dd799e\") " pod="openshift-marketplace/marketplace-operator-79b997595-6lxld" Nov 28 13:24:59 crc kubenswrapper[4970]: I1128 13:24:59.213721 4970 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-6lxld" Nov 28 13:24:59 crc kubenswrapper[4970]: I1128 13:24:59.290417 4970 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-jcv9n" Nov 28 13:24:59 crc kubenswrapper[4970]: I1128 13:24:59.448717 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jnsfh\" (UniqueName: \"kubernetes.io/projected/619af67d-331c-4b38-b536-269ba823fd75-kube-api-access-jnsfh\") pod \"619af67d-331c-4b38-b536-269ba823fd75\" (UID: \"619af67d-331c-4b38-b536-269ba823fd75\") " Nov 28 13:24:59 crc kubenswrapper[4970]: I1128 13:24:59.448766 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/619af67d-331c-4b38-b536-269ba823fd75-catalog-content\") pod \"619af67d-331c-4b38-b536-269ba823fd75\" (UID: \"619af67d-331c-4b38-b536-269ba823fd75\") " Nov 28 13:24:59 crc kubenswrapper[4970]: I1128 13:24:59.448864 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/619af67d-331c-4b38-b536-269ba823fd75-utilities\") pod \"619af67d-331c-4b38-b536-269ba823fd75\" (UID: \"619af67d-331c-4b38-b536-269ba823fd75\") " Nov 28 13:24:59 crc kubenswrapper[4970]: I1128 13:24:59.450031 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/619af67d-331c-4b38-b536-269ba823fd75-utilities" (OuterVolumeSpecName: "utilities") pod "619af67d-331c-4b38-b536-269ba823fd75" (UID: "619af67d-331c-4b38-b536-269ba823fd75"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 28 13:24:59 crc kubenswrapper[4970]: I1128 13:24:59.453410 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/619af67d-331c-4b38-b536-269ba823fd75-kube-api-access-jnsfh" (OuterVolumeSpecName: "kube-api-access-jnsfh") pod "619af67d-331c-4b38-b536-269ba823fd75" (UID: "619af67d-331c-4b38-b536-269ba823fd75"). InnerVolumeSpecName "kube-api-access-jnsfh". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 28 13:24:59 crc kubenswrapper[4970]: I1128 13:24:59.497506 4970 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-nccsb" Nov 28 13:24:59 crc kubenswrapper[4970]: I1128 13:24:59.506749 4970 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-wwr74" Nov 28 13:24:59 crc kubenswrapper[4970]: I1128 13:24:59.526954 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/619af67d-331c-4b38-b536-269ba823fd75-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "619af67d-331c-4b38-b536-269ba823fd75" (UID: "619af67d-331c-4b38-b536-269ba823fd75"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 28 13:24:59 crc kubenswrapper[4970]: I1128 13:24:59.550729 4970 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/619af67d-331c-4b38-b536-269ba823fd75-utilities\") on node \"crc\" DevicePath \"\"" Nov 28 13:24:59 crc kubenswrapper[4970]: I1128 13:24:59.550763 4970 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jnsfh\" (UniqueName: \"kubernetes.io/projected/619af67d-331c-4b38-b536-269ba823fd75-kube-api-access-jnsfh\") on node \"crc\" DevicePath \"\"" Nov 28 13:24:59 crc kubenswrapper[4970]: I1128 13:24:59.550774 4970 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/619af67d-331c-4b38-b536-269ba823fd75-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 28 13:24:59 crc kubenswrapper[4970]: I1128 13:24:59.553147 4970 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-mwg2n" Nov 28 13:24:59 crc kubenswrapper[4970]: I1128 13:24:59.567379 4970 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-8hjm2" Nov 28 13:24:59 crc kubenswrapper[4970]: I1128 13:24:59.629021 4970 generic.go:334] "Generic (PLEG): container finished" podID="a92f4929-dcab-4362-a5f2-c648f274bf04" containerID="e42596a6fd10a064e3a8d5ef2207ddfd0fe766ec8ad644cb119a9d63d9354d75" exitCode=0 Nov 28 13:24:59 crc kubenswrapper[4970]: I1128 13:24:59.629073 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wwr74" event={"ID":"a92f4929-dcab-4362-a5f2-c648f274bf04","Type":"ContainerDied","Data":"e42596a6fd10a064e3a8d5ef2207ddfd0fe766ec8ad644cb119a9d63d9354d75"} Nov 28 13:24:59 crc kubenswrapper[4970]: I1128 13:24:59.629097 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wwr74" event={"ID":"a92f4929-dcab-4362-a5f2-c648f274bf04","Type":"ContainerDied","Data":"0f19a79a745538c2959f7eaf13d5862d155dbeea72a7c516e03cfa85ed864e9a"} Nov 28 13:24:59 crc kubenswrapper[4970]: I1128 13:24:59.629113 4970 scope.go:117] "RemoveContainer" containerID="e42596a6fd10a064e3a8d5ef2207ddfd0fe766ec8ad644cb119a9d63d9354d75" Nov 28 13:24:59 crc kubenswrapper[4970]: I1128 13:24:59.629236 4970 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-wwr74" Nov 28 13:24:59 crc kubenswrapper[4970]: I1128 13:24:59.636942 4970 generic.go:334] "Generic (PLEG): container finished" podID="619af67d-331c-4b38-b536-269ba823fd75" containerID="372b111f33d89b882be0cd06962157092d2b2f730628daa2a1f56f329c7476aa" exitCode=0 Nov 28 13:24:59 crc kubenswrapper[4970]: I1128 13:24:59.637034 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jcv9n" event={"ID":"619af67d-331c-4b38-b536-269ba823fd75","Type":"ContainerDied","Data":"372b111f33d89b882be0cd06962157092d2b2f730628daa2a1f56f329c7476aa"} Nov 28 13:24:59 crc kubenswrapper[4970]: I1128 13:24:59.637062 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jcv9n" event={"ID":"619af67d-331c-4b38-b536-269ba823fd75","Type":"ContainerDied","Data":"28be4c283cf4514048523c6a051008763a544e99a642d0a9f6dcebb2c99341ed"} Nov 28 13:24:59 crc kubenswrapper[4970]: I1128 13:24:59.637126 4970 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-jcv9n" Nov 28 13:24:59 crc kubenswrapper[4970]: I1128 13:24:59.643282 4970 generic.go:334] "Generic (PLEG): container finished" podID="514859c2-bd3c-4ccb-90b0-61180a1bc297" containerID="69ef22519acaf3b357a25c50f3ae3e80401c3a4f71a795b0a33d9cdc4be82cbf" exitCode=0 Nov 28 13:24:59 crc kubenswrapper[4970]: I1128 13:24:59.643437 4970 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-nccsb" Nov 28 13:24:59 crc kubenswrapper[4970]: I1128 13:24:59.643470 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-nccsb" event={"ID":"514859c2-bd3c-4ccb-90b0-61180a1bc297","Type":"ContainerDied","Data":"69ef22519acaf3b357a25c50f3ae3e80401c3a4f71a795b0a33d9cdc4be82cbf"} Nov 28 13:24:59 crc kubenswrapper[4970]: I1128 13:24:59.643498 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-nccsb" event={"ID":"514859c2-bd3c-4ccb-90b0-61180a1bc297","Type":"ContainerDied","Data":"c47d3ee97917003dda9f2a5b183cf5f436c96b979077fc232c4e861fa33d4ef8"} Nov 28 13:24:59 crc kubenswrapper[4970]: I1128 13:24:59.651118 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/514859c2-bd3c-4ccb-90b0-61180a1bc297-marketplace-trusted-ca\") pod \"514859c2-bd3c-4ccb-90b0-61180a1bc297\" (UID: \"514859c2-bd3c-4ccb-90b0-61180a1bc297\") " Nov 28 13:24:59 crc kubenswrapper[4970]: I1128 13:24:59.651134 4970 generic.go:334] "Generic (PLEG): container finished" podID="fdf78924-9472-414e-baf6-822e511c464c" containerID="56aa5529733972838719518afcebdd586895c9f7e2d8edd529e9035247478a45" exitCode=0 Nov 28 13:24:59 crc kubenswrapper[4970]: I1128 13:24:59.651192 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fdf78924-9472-414e-baf6-822e511c464c-utilities\") pod \"fdf78924-9472-414e-baf6-822e511c464c\" (UID: \"fdf78924-9472-414e-baf6-822e511c464c\") " Nov 28 13:24:59 crc kubenswrapper[4970]: I1128 13:24:59.651235 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a92f4929-dcab-4362-a5f2-c648f274bf04-utilities\") pod \"a92f4929-dcab-4362-a5f2-c648f274bf04\" (UID: \"a92f4929-dcab-4362-a5f2-c648f274bf04\") " Nov 28 13:24:59 crc kubenswrapper[4970]: I1128 13:24:59.651262 4970 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-mwg2n" Nov 28 13:24:59 crc kubenswrapper[4970]: I1128 13:24:59.651715 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/514859c2-bd3c-4ccb-90b0-61180a1bc297-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "514859c2-bd3c-4ccb-90b0-61180a1bc297" (UID: "514859c2-bd3c-4ccb-90b0-61180a1bc297"). InnerVolumeSpecName "marketplace-trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 28 13:24:59 crc kubenswrapper[4970]: I1128 13:24:59.651265 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fdf78924-9472-414e-baf6-822e511c464c-catalog-content\") pod \"fdf78924-9472-414e-baf6-822e511c464c\" (UID: \"fdf78924-9472-414e-baf6-822e511c464c\") " Nov 28 13:24:59 crc kubenswrapper[4970]: I1128 13:24:59.651785 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ac87c2e3-5a6b-4998-8db7-165e571f6f52-utilities\") pod \"ac87c2e3-5a6b-4998-8db7-165e571f6f52\" (UID: \"ac87c2e3-5a6b-4998-8db7-165e571f6f52\") " Nov 28 13:24:59 crc kubenswrapper[4970]: I1128 13:24:59.651810 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jt7m4\" (UniqueName: \"kubernetes.io/projected/514859c2-bd3c-4ccb-90b0-61180a1bc297-kube-api-access-jt7m4\") pod \"514859c2-bd3c-4ccb-90b0-61180a1bc297\" (UID: \"514859c2-bd3c-4ccb-90b0-61180a1bc297\") " Nov 28 13:24:59 crc kubenswrapper[4970]: I1128 13:24:59.651831 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a92f4929-dcab-4362-a5f2-c648f274bf04-catalog-content\") pod \"a92f4929-dcab-4362-a5f2-c648f274bf04\" (UID: \"a92f4929-dcab-4362-a5f2-c648f274bf04\") " Nov 28 13:24:59 crc kubenswrapper[4970]: I1128 13:24:59.651878 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/514859c2-bd3c-4ccb-90b0-61180a1bc297-marketplace-operator-metrics\") pod \"514859c2-bd3c-4ccb-90b0-61180a1bc297\" (UID: \"514859c2-bd3c-4ccb-90b0-61180a1bc297\") " Nov 28 13:24:59 crc kubenswrapper[4970]: I1128 13:24:59.651916 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fjh5c\" (UniqueName: \"kubernetes.io/projected/a92f4929-dcab-4362-a5f2-c648f274bf04-kube-api-access-fjh5c\") pod \"a92f4929-dcab-4362-a5f2-c648f274bf04\" (UID: \"a92f4929-dcab-4362-a5f2-c648f274bf04\") " Nov 28 13:24:59 crc kubenswrapper[4970]: I1128 13:24:59.651947 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ac87c2e3-5a6b-4998-8db7-165e571f6f52-catalog-content\") pod \"ac87c2e3-5a6b-4998-8db7-165e571f6f52\" (UID: \"ac87c2e3-5a6b-4998-8db7-165e571f6f52\") " Nov 28 13:24:59 crc kubenswrapper[4970]: I1128 13:24:59.651986 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5h75m\" (UniqueName: \"kubernetes.io/projected/ac87c2e3-5a6b-4998-8db7-165e571f6f52-kube-api-access-5h75m\") pod \"ac87c2e3-5a6b-4998-8db7-165e571f6f52\" (UID: \"ac87c2e3-5a6b-4998-8db7-165e571f6f52\") " Nov 28 13:24:59 crc kubenswrapper[4970]: I1128 13:24:59.652019 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6jht9\" (UniqueName: \"kubernetes.io/projected/fdf78924-9472-414e-baf6-822e511c464c-kube-api-access-6jht9\") pod \"fdf78924-9472-414e-baf6-822e511c464c\" (UID: \"fdf78924-9472-414e-baf6-822e511c464c\") " Nov 28 13:24:59 crc kubenswrapper[4970]: I1128 13:24:59.652331 4970 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/514859c2-bd3c-4ccb-90b0-61180a1bc297-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Nov 28 13:24:59 crc kubenswrapper[4970]: I1128 13:24:59.652345 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fdf78924-9472-414e-baf6-822e511c464c-utilities" (OuterVolumeSpecName: "utilities") pod "fdf78924-9472-414e-baf6-822e511c464c" (UID: "fdf78924-9472-414e-baf6-822e511c464c"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 28 13:24:59 crc kubenswrapper[4970]: I1128 13:24:59.652838 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mwg2n" event={"ID":"fdf78924-9472-414e-baf6-822e511c464c","Type":"ContainerDied","Data":"56aa5529733972838719518afcebdd586895c9f7e2d8edd529e9035247478a45"} Nov 28 13:24:59 crc kubenswrapper[4970]: I1128 13:24:59.652871 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mwg2n" event={"ID":"fdf78924-9472-414e-baf6-822e511c464c","Type":"ContainerDied","Data":"dc6ae09621ca3f4bb4d17e2272a3fa66143085d76106685f129f884cf64db7fe"} Nov 28 13:24:59 crc kubenswrapper[4970]: I1128 13:24:59.654068 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a92f4929-dcab-4362-a5f2-c648f274bf04-utilities" (OuterVolumeSpecName: "utilities") pod "a92f4929-dcab-4362-a5f2-c648f274bf04" (UID: "a92f4929-dcab-4362-a5f2-c648f274bf04"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 28 13:24:59 crc kubenswrapper[4970]: I1128 13:24:59.654404 4970 scope.go:117] "RemoveContainer" containerID="086d1c6c32439850e2b06584180f0ca96a77b64a3d5bf6b3abf029c47e1d7b69" Nov 28 13:24:59 crc kubenswrapper[4970]: I1128 13:24:59.655788 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ac87c2e3-5a6b-4998-8db7-165e571f6f52-utilities" (OuterVolumeSpecName: "utilities") pod "ac87c2e3-5a6b-4998-8db7-165e571f6f52" (UID: "ac87c2e3-5a6b-4998-8db7-165e571f6f52"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 28 13:24:59 crc kubenswrapper[4970]: I1128 13:24:59.657771 4970 generic.go:334] "Generic (PLEG): container finished" podID="ac87c2e3-5a6b-4998-8db7-165e571f6f52" containerID="61ffead2ec33c64616686a0a1bf5037299cc6617b355521167580fe7f77e448a" exitCode=0 Nov 28 13:24:59 crc kubenswrapper[4970]: I1128 13:24:59.658049 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8hjm2" event={"ID":"ac87c2e3-5a6b-4998-8db7-165e571f6f52","Type":"ContainerDied","Data":"61ffead2ec33c64616686a0a1bf5037299cc6617b355521167580fe7f77e448a"} Nov 28 13:24:59 crc kubenswrapper[4970]: I1128 13:24:59.658164 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8hjm2" event={"ID":"ac87c2e3-5a6b-4998-8db7-165e571f6f52","Type":"ContainerDied","Data":"9f6690e4b3da7cb1949cd7350c0cd605e2b753e77d450c0302323c8b4f5b87e0"} Nov 28 13:24:59 crc kubenswrapper[4970]: I1128 13:24:59.658361 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a92f4929-dcab-4362-a5f2-c648f274bf04-kube-api-access-fjh5c" (OuterVolumeSpecName: "kube-api-access-fjh5c") pod "a92f4929-dcab-4362-a5f2-c648f274bf04" (UID: "a92f4929-dcab-4362-a5f2-c648f274bf04"). InnerVolumeSpecName "kube-api-access-fjh5c". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 28 13:24:59 crc kubenswrapper[4970]: I1128 13:24:59.658554 4970 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-8hjm2" Nov 28 13:24:59 crc kubenswrapper[4970]: I1128 13:24:59.658987 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/514859c2-bd3c-4ccb-90b0-61180a1bc297-kube-api-access-jt7m4" (OuterVolumeSpecName: "kube-api-access-jt7m4") pod "514859c2-bd3c-4ccb-90b0-61180a1bc297" (UID: "514859c2-bd3c-4ccb-90b0-61180a1bc297"). InnerVolumeSpecName "kube-api-access-jt7m4". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 28 13:24:59 crc kubenswrapper[4970]: I1128 13:24:59.659457 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ac87c2e3-5a6b-4998-8db7-165e571f6f52-kube-api-access-5h75m" (OuterVolumeSpecName: "kube-api-access-5h75m") pod "ac87c2e3-5a6b-4998-8db7-165e571f6f52" (UID: "ac87c2e3-5a6b-4998-8db7-165e571f6f52"). InnerVolumeSpecName "kube-api-access-5h75m". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 28 13:24:59 crc kubenswrapper[4970]: I1128 13:24:59.659992 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fdf78924-9472-414e-baf6-822e511c464c-kube-api-access-6jht9" (OuterVolumeSpecName: "kube-api-access-6jht9") pod "fdf78924-9472-414e-baf6-822e511c464c" (UID: "fdf78924-9472-414e-baf6-822e511c464c"). InnerVolumeSpecName "kube-api-access-6jht9". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 28 13:24:59 crc kubenswrapper[4970]: I1128 13:24:59.663545 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/514859c2-bd3c-4ccb-90b0-61180a1bc297-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "514859c2-bd3c-4ccb-90b0-61180a1bc297" (UID: "514859c2-bd3c-4ccb-90b0-61180a1bc297"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 28 13:24:59 crc kubenswrapper[4970]: I1128 13:24:59.684001 4970 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-jcv9n"] Nov 28 13:24:59 crc kubenswrapper[4970]: I1128 13:24:59.687840 4970 scope.go:117] "RemoveContainer" containerID="4d61510805f9bef0fada40462a3ee2a09ab6aa97240ca26693a039d7f1fa5343" Nov 28 13:24:59 crc kubenswrapper[4970]: I1128 13:24:59.689231 4970 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-jcv9n"] Nov 28 13:24:59 crc kubenswrapper[4970]: I1128 13:24:59.699347 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ac87c2e3-5a6b-4998-8db7-165e571f6f52-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ac87c2e3-5a6b-4998-8db7-165e571f6f52" (UID: "ac87c2e3-5a6b-4998-8db7-165e571f6f52"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 28 13:24:59 crc kubenswrapper[4970]: I1128 13:24:59.706022 4970 scope.go:117] "RemoveContainer" containerID="e42596a6fd10a064e3a8d5ef2207ddfd0fe766ec8ad644cb119a9d63d9354d75" Nov 28 13:24:59 crc kubenswrapper[4970]: E1128 13:24:59.706510 4970 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e42596a6fd10a064e3a8d5ef2207ddfd0fe766ec8ad644cb119a9d63d9354d75\": container with ID starting with e42596a6fd10a064e3a8d5ef2207ddfd0fe766ec8ad644cb119a9d63d9354d75 not found: ID does not exist" containerID="e42596a6fd10a064e3a8d5ef2207ddfd0fe766ec8ad644cb119a9d63d9354d75" Nov 28 13:24:59 crc kubenswrapper[4970]: I1128 13:24:59.706543 4970 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e42596a6fd10a064e3a8d5ef2207ddfd0fe766ec8ad644cb119a9d63d9354d75"} err="failed to get container status \"e42596a6fd10a064e3a8d5ef2207ddfd0fe766ec8ad644cb119a9d63d9354d75\": rpc error: code = NotFound desc = could not find container \"e42596a6fd10a064e3a8d5ef2207ddfd0fe766ec8ad644cb119a9d63d9354d75\": container with ID starting with e42596a6fd10a064e3a8d5ef2207ddfd0fe766ec8ad644cb119a9d63d9354d75 not found: ID does not exist" Nov 28 13:24:59 crc kubenswrapper[4970]: I1128 13:24:59.706564 4970 scope.go:117] "RemoveContainer" containerID="086d1c6c32439850e2b06584180f0ca96a77b64a3d5bf6b3abf029c47e1d7b69" Nov 28 13:24:59 crc kubenswrapper[4970]: E1128 13:24:59.707018 4970 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"086d1c6c32439850e2b06584180f0ca96a77b64a3d5bf6b3abf029c47e1d7b69\": container with ID starting with 086d1c6c32439850e2b06584180f0ca96a77b64a3d5bf6b3abf029c47e1d7b69 not found: ID does not exist" containerID="086d1c6c32439850e2b06584180f0ca96a77b64a3d5bf6b3abf029c47e1d7b69" Nov 28 13:24:59 crc kubenswrapper[4970]: I1128 13:24:59.707075 4970 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"086d1c6c32439850e2b06584180f0ca96a77b64a3d5bf6b3abf029c47e1d7b69"} err="failed to get container status \"086d1c6c32439850e2b06584180f0ca96a77b64a3d5bf6b3abf029c47e1d7b69\": rpc error: code = NotFound desc = could not find container \"086d1c6c32439850e2b06584180f0ca96a77b64a3d5bf6b3abf029c47e1d7b69\": container with ID starting with 086d1c6c32439850e2b06584180f0ca96a77b64a3d5bf6b3abf029c47e1d7b69 not found: ID does not exist" Nov 28 13:24:59 crc kubenswrapper[4970]: I1128 13:24:59.707111 4970 scope.go:117] "RemoveContainer" containerID="4d61510805f9bef0fada40462a3ee2a09ab6aa97240ca26693a039d7f1fa5343" Nov 28 13:24:59 crc kubenswrapper[4970]: E1128 13:24:59.707521 4970 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4d61510805f9bef0fada40462a3ee2a09ab6aa97240ca26693a039d7f1fa5343\": container with ID starting with 4d61510805f9bef0fada40462a3ee2a09ab6aa97240ca26693a039d7f1fa5343 not found: ID does not exist" containerID="4d61510805f9bef0fada40462a3ee2a09ab6aa97240ca26693a039d7f1fa5343" Nov 28 13:24:59 crc kubenswrapper[4970]: I1128 13:24:59.707548 4970 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4d61510805f9bef0fada40462a3ee2a09ab6aa97240ca26693a039d7f1fa5343"} err="failed to get container status \"4d61510805f9bef0fada40462a3ee2a09ab6aa97240ca26693a039d7f1fa5343\": rpc error: code = NotFound desc = could not 
find container \"4d61510805f9bef0fada40462a3ee2a09ab6aa97240ca26693a039d7f1fa5343\": container with ID starting with 4d61510805f9bef0fada40462a3ee2a09ab6aa97240ca26693a039d7f1fa5343 not found: ID does not exist" Nov 28 13:24:59 crc kubenswrapper[4970]: I1128 13:24:59.707562 4970 scope.go:117] "RemoveContainer" containerID="372b111f33d89b882be0cd06962157092d2b2f730628daa2a1f56f329c7476aa" Nov 28 13:24:59 crc kubenswrapper[4970]: I1128 13:24:59.731000 4970 scope.go:117] "RemoveContainer" containerID="a7fc78551a9342d51dcf43583630959f6729932e312d0b2a660beef6e2520a3d" Nov 28 13:24:59 crc kubenswrapper[4970]: I1128 13:24:59.731088 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fdf78924-9472-414e-baf6-822e511c464c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "fdf78924-9472-414e-baf6-822e511c464c" (UID: "fdf78924-9472-414e-baf6-822e511c464c"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 28 13:24:59 crc kubenswrapper[4970]: I1128 13:24:59.743135 4970 scope.go:117] "RemoveContainer" containerID="724d8855caf229a5c2bf2f89ca8513d44756c2e9e14982489cb8cce2217608c1" Nov 28 13:24:59 crc kubenswrapper[4970]: I1128 13:24:59.755616 4970 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5h75m\" (UniqueName: \"kubernetes.io/projected/ac87c2e3-5a6b-4998-8db7-165e571f6f52-kube-api-access-5h75m\") on node \"crc\" DevicePath \"\"" Nov 28 13:24:59 crc kubenswrapper[4970]: I1128 13:24:59.757663 4970 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6jht9\" (UniqueName: \"kubernetes.io/projected/fdf78924-9472-414e-baf6-822e511c464c-kube-api-access-6jht9\") on node \"crc\" DevicePath \"\"" Nov 28 13:24:59 crc kubenswrapper[4970]: I1128 13:24:59.757737 4970 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fdf78924-9472-414e-baf6-822e511c464c-utilities\") on node \"crc\" DevicePath \"\"" Nov 28 13:24:59 crc kubenswrapper[4970]: I1128 13:24:59.757801 4970 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a92f4929-dcab-4362-a5f2-c648f274bf04-utilities\") on node \"crc\" DevicePath \"\"" Nov 28 13:24:59 crc kubenswrapper[4970]: I1128 13:24:59.757872 4970 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fdf78924-9472-414e-baf6-822e511c464c-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 28 13:24:59 crc kubenswrapper[4970]: I1128 13:24:59.757942 4970 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ac87c2e3-5a6b-4998-8db7-165e571f6f52-utilities\") on node \"crc\" DevicePath \"\"" Nov 28 13:24:59 crc kubenswrapper[4970]: I1128 13:24:59.758019 4970 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jt7m4\" (UniqueName: \"kubernetes.io/projected/514859c2-bd3c-4ccb-90b0-61180a1bc297-kube-api-access-jt7m4\") on node \"crc\" DevicePath \"\"" Nov 28 13:24:59 crc kubenswrapper[4970]: I1128 13:24:59.758078 4970 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/514859c2-bd3c-4ccb-90b0-61180a1bc297-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Nov 28 13:24:59 crc kubenswrapper[4970]: I1128 13:24:59.758141 4970 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fjh5c\" (UniqueName: 
\"kubernetes.io/projected/a92f4929-dcab-4362-a5f2-c648f274bf04-kube-api-access-fjh5c\") on node \"crc\" DevicePath \"\"" Nov 28 13:24:59 crc kubenswrapper[4970]: I1128 13:24:59.758228 4970 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ac87c2e3-5a6b-4998-8db7-165e571f6f52-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 28 13:24:59 crc kubenswrapper[4970]: I1128 13:24:59.758605 4970 scope.go:117] "RemoveContainer" containerID="372b111f33d89b882be0cd06962157092d2b2f730628daa2a1f56f329c7476aa" Nov 28 13:24:59 crc kubenswrapper[4970]: E1128 13:24:59.759066 4970 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"372b111f33d89b882be0cd06962157092d2b2f730628daa2a1f56f329c7476aa\": container with ID starting with 372b111f33d89b882be0cd06962157092d2b2f730628daa2a1f56f329c7476aa not found: ID does not exist" containerID="372b111f33d89b882be0cd06962157092d2b2f730628daa2a1f56f329c7476aa" Nov 28 13:24:59 crc kubenswrapper[4970]: I1128 13:24:59.759125 4970 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"372b111f33d89b882be0cd06962157092d2b2f730628daa2a1f56f329c7476aa"} err="failed to get container status \"372b111f33d89b882be0cd06962157092d2b2f730628daa2a1f56f329c7476aa\": rpc error: code = NotFound desc = could not find container \"372b111f33d89b882be0cd06962157092d2b2f730628daa2a1f56f329c7476aa\": container with ID starting with 372b111f33d89b882be0cd06962157092d2b2f730628daa2a1f56f329c7476aa not found: ID does not exist" Nov 28 13:24:59 crc kubenswrapper[4970]: I1128 13:24:59.759195 4970 scope.go:117] "RemoveContainer" containerID="a7fc78551a9342d51dcf43583630959f6729932e312d0b2a660beef6e2520a3d" Nov 28 13:24:59 crc kubenswrapper[4970]: E1128 13:24:59.759958 4970 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a7fc78551a9342d51dcf43583630959f6729932e312d0b2a660beef6e2520a3d\": container with ID starting with a7fc78551a9342d51dcf43583630959f6729932e312d0b2a660beef6e2520a3d not found: ID does not exist" containerID="a7fc78551a9342d51dcf43583630959f6729932e312d0b2a660beef6e2520a3d" Nov 28 13:24:59 crc kubenswrapper[4970]: I1128 13:24:59.760125 4970 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a7fc78551a9342d51dcf43583630959f6729932e312d0b2a660beef6e2520a3d"} err="failed to get container status \"a7fc78551a9342d51dcf43583630959f6729932e312d0b2a660beef6e2520a3d\": rpc error: code = NotFound desc = could not find container \"a7fc78551a9342d51dcf43583630959f6729932e312d0b2a660beef6e2520a3d\": container with ID starting with a7fc78551a9342d51dcf43583630959f6729932e312d0b2a660beef6e2520a3d not found: ID does not exist" Nov 28 13:24:59 crc kubenswrapper[4970]: I1128 13:24:59.760296 4970 scope.go:117] "RemoveContainer" containerID="724d8855caf229a5c2bf2f89ca8513d44756c2e9e14982489cb8cce2217608c1" Nov 28 13:24:59 crc kubenswrapper[4970]: E1128 13:24:59.760797 4970 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"724d8855caf229a5c2bf2f89ca8513d44756c2e9e14982489cb8cce2217608c1\": container with ID starting with 724d8855caf229a5c2bf2f89ca8513d44756c2e9e14982489cb8cce2217608c1 not found: ID does not exist" containerID="724d8855caf229a5c2bf2f89ca8513d44756c2e9e14982489cb8cce2217608c1" Nov 28 13:24:59 crc kubenswrapper[4970]: 
I1128 13:24:59.760885 4970 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"724d8855caf229a5c2bf2f89ca8513d44756c2e9e14982489cb8cce2217608c1"} err="failed to get container status \"724d8855caf229a5c2bf2f89ca8513d44756c2e9e14982489cb8cce2217608c1\": rpc error: code = NotFound desc = could not find container \"724d8855caf229a5c2bf2f89ca8513d44756c2e9e14982489cb8cce2217608c1\": container with ID starting with 724d8855caf229a5c2bf2f89ca8513d44756c2e9e14982489cb8cce2217608c1 not found: ID does not exist" Nov 28 13:24:59 crc kubenswrapper[4970]: I1128 13:24:59.760960 4970 scope.go:117] "RemoveContainer" containerID="69ef22519acaf3b357a25c50f3ae3e80401c3a4f71a795b0a33d9cdc4be82cbf" Nov 28 13:24:59 crc kubenswrapper[4970]: I1128 13:24:59.771373 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a92f4929-dcab-4362-a5f2-c648f274bf04-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a92f4929-dcab-4362-a5f2-c648f274bf04" (UID: "a92f4929-dcab-4362-a5f2-c648f274bf04"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 28 13:24:59 crc kubenswrapper[4970]: I1128 13:24:59.774020 4970 scope.go:117] "RemoveContainer" containerID="8f71c320f5c6711aaf58cf57698a53bb88acc22f7294690f3fd6abe7955cdb28" Nov 28 13:24:59 crc kubenswrapper[4970]: I1128 13:24:59.786789 4970 scope.go:117] "RemoveContainer" containerID="69ef22519acaf3b357a25c50f3ae3e80401c3a4f71a795b0a33d9cdc4be82cbf" Nov 28 13:24:59 crc kubenswrapper[4970]: E1128 13:24:59.788650 4970 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"69ef22519acaf3b357a25c50f3ae3e80401c3a4f71a795b0a33d9cdc4be82cbf\": container with ID starting with 69ef22519acaf3b357a25c50f3ae3e80401c3a4f71a795b0a33d9cdc4be82cbf not found: ID does not exist" containerID="69ef22519acaf3b357a25c50f3ae3e80401c3a4f71a795b0a33d9cdc4be82cbf" Nov 28 13:24:59 crc kubenswrapper[4970]: I1128 13:24:59.788735 4970 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"69ef22519acaf3b357a25c50f3ae3e80401c3a4f71a795b0a33d9cdc4be82cbf"} err="failed to get container status \"69ef22519acaf3b357a25c50f3ae3e80401c3a4f71a795b0a33d9cdc4be82cbf\": rpc error: code = NotFound desc = could not find container \"69ef22519acaf3b357a25c50f3ae3e80401c3a4f71a795b0a33d9cdc4be82cbf\": container with ID starting with 69ef22519acaf3b357a25c50f3ae3e80401c3a4f71a795b0a33d9cdc4be82cbf not found: ID does not exist" Nov 28 13:24:59 crc kubenswrapper[4970]: I1128 13:24:59.788770 4970 scope.go:117] "RemoveContainer" containerID="8f71c320f5c6711aaf58cf57698a53bb88acc22f7294690f3fd6abe7955cdb28" Nov 28 13:24:59 crc kubenswrapper[4970]: E1128 13:24:59.789121 4970 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8f71c320f5c6711aaf58cf57698a53bb88acc22f7294690f3fd6abe7955cdb28\": container with ID starting with 8f71c320f5c6711aaf58cf57698a53bb88acc22f7294690f3fd6abe7955cdb28 not found: ID does not exist" containerID="8f71c320f5c6711aaf58cf57698a53bb88acc22f7294690f3fd6abe7955cdb28" Nov 28 13:24:59 crc kubenswrapper[4970]: I1128 13:24:59.789188 4970 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8f71c320f5c6711aaf58cf57698a53bb88acc22f7294690f3fd6abe7955cdb28"} err="failed to get container status 
\"8f71c320f5c6711aaf58cf57698a53bb88acc22f7294690f3fd6abe7955cdb28\": rpc error: code = NotFound desc = could not find container \"8f71c320f5c6711aaf58cf57698a53bb88acc22f7294690f3fd6abe7955cdb28\": container with ID starting with 8f71c320f5c6711aaf58cf57698a53bb88acc22f7294690f3fd6abe7955cdb28 not found: ID does not exist" Nov 28 13:24:59 crc kubenswrapper[4970]: I1128 13:24:59.789248 4970 scope.go:117] "RemoveContainer" containerID="56aa5529733972838719518afcebdd586895c9f7e2d8edd529e9035247478a45" Nov 28 13:24:59 crc kubenswrapper[4970]: I1128 13:24:59.803809 4970 scope.go:117] "RemoveContainer" containerID="0eed0f215eb4f624a7447160993f2dc93fdcfce64131091866e58db660c3d413" Nov 28 13:24:59 crc kubenswrapper[4970]: I1128 13:24:59.810173 4970 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-6lxld"] Nov 28 13:24:59 crc kubenswrapper[4970]: W1128 13:24:59.814236 4970 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcea9d8a5_d14f_4f9c_a800_815168dd799e.slice/crio-22e637b1343b71f19ae1deeed3a603f810bdadef45767cda47469651b740ab7d WatchSource:0}: Error finding container 22e637b1343b71f19ae1deeed3a603f810bdadef45767cda47469651b740ab7d: Status 404 returned error can't find the container with id 22e637b1343b71f19ae1deeed3a603f810bdadef45767cda47469651b740ab7d Nov 28 13:24:59 crc kubenswrapper[4970]: I1128 13:24:59.824019 4970 scope.go:117] "RemoveContainer" containerID="d54ce830ae198678b7db53aadefa7d134348e1b99e4ab8a803f24bee3f263931" Nov 28 13:24:59 crc kubenswrapper[4970]: I1128 13:24:59.839314 4970 scope.go:117] "RemoveContainer" containerID="56aa5529733972838719518afcebdd586895c9f7e2d8edd529e9035247478a45" Nov 28 13:24:59 crc kubenswrapper[4970]: E1128 13:24:59.839794 4970 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"56aa5529733972838719518afcebdd586895c9f7e2d8edd529e9035247478a45\": container with ID starting with 56aa5529733972838719518afcebdd586895c9f7e2d8edd529e9035247478a45 not found: ID does not exist" containerID="56aa5529733972838719518afcebdd586895c9f7e2d8edd529e9035247478a45" Nov 28 13:24:59 crc kubenswrapper[4970]: I1128 13:24:59.839825 4970 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"56aa5529733972838719518afcebdd586895c9f7e2d8edd529e9035247478a45"} err="failed to get container status \"56aa5529733972838719518afcebdd586895c9f7e2d8edd529e9035247478a45\": rpc error: code = NotFound desc = could not find container \"56aa5529733972838719518afcebdd586895c9f7e2d8edd529e9035247478a45\": container with ID starting with 56aa5529733972838719518afcebdd586895c9f7e2d8edd529e9035247478a45 not found: ID does not exist" Nov 28 13:24:59 crc kubenswrapper[4970]: I1128 13:24:59.839848 4970 scope.go:117] "RemoveContainer" containerID="0eed0f215eb4f624a7447160993f2dc93fdcfce64131091866e58db660c3d413" Nov 28 13:24:59 crc kubenswrapper[4970]: E1128 13:24:59.840231 4970 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0eed0f215eb4f624a7447160993f2dc93fdcfce64131091866e58db660c3d413\": container with ID starting with 0eed0f215eb4f624a7447160993f2dc93fdcfce64131091866e58db660c3d413 not found: ID does not exist" containerID="0eed0f215eb4f624a7447160993f2dc93fdcfce64131091866e58db660c3d413" Nov 28 13:24:59 crc kubenswrapper[4970]: I1128 13:24:59.840254 4970 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0eed0f215eb4f624a7447160993f2dc93fdcfce64131091866e58db660c3d413"} err="failed to get container status \"0eed0f215eb4f624a7447160993f2dc93fdcfce64131091866e58db660c3d413\": rpc error: code = NotFound desc = could not find container \"0eed0f215eb4f624a7447160993f2dc93fdcfce64131091866e58db660c3d413\": container with ID starting with 0eed0f215eb4f624a7447160993f2dc93fdcfce64131091866e58db660c3d413 not found: ID does not exist" Nov 28 13:24:59 crc kubenswrapper[4970]: I1128 13:24:59.840266 4970 scope.go:117] "RemoveContainer" containerID="d54ce830ae198678b7db53aadefa7d134348e1b99e4ab8a803f24bee3f263931" Nov 28 13:24:59 crc kubenswrapper[4970]: E1128 13:24:59.840519 4970 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d54ce830ae198678b7db53aadefa7d134348e1b99e4ab8a803f24bee3f263931\": container with ID starting with d54ce830ae198678b7db53aadefa7d134348e1b99e4ab8a803f24bee3f263931 not found: ID does not exist" containerID="d54ce830ae198678b7db53aadefa7d134348e1b99e4ab8a803f24bee3f263931" Nov 28 13:24:59 crc kubenswrapper[4970]: I1128 13:24:59.840571 4970 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d54ce830ae198678b7db53aadefa7d134348e1b99e4ab8a803f24bee3f263931"} err="failed to get container status \"d54ce830ae198678b7db53aadefa7d134348e1b99e4ab8a803f24bee3f263931\": rpc error: code = NotFound desc = could not find container \"d54ce830ae198678b7db53aadefa7d134348e1b99e4ab8a803f24bee3f263931\": container with ID starting with d54ce830ae198678b7db53aadefa7d134348e1b99e4ab8a803f24bee3f263931 not found: ID does not exist" Nov 28 13:24:59 crc kubenswrapper[4970]: I1128 13:24:59.840603 4970 scope.go:117] "RemoveContainer" containerID="61ffead2ec33c64616686a0a1bf5037299cc6617b355521167580fe7f77e448a" Nov 28 13:24:59 crc kubenswrapper[4970]: I1128 13:24:59.853432 4970 scope.go:117] "RemoveContainer" containerID="488e868e28f939a9fb5e1536ee9d110a27d69b0d667ea9fe850be4cee9b3139c" Nov 28 13:24:59 crc kubenswrapper[4970]: I1128 13:24:59.859583 4970 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a92f4929-dcab-4362-a5f2-c648f274bf04-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 28 13:24:59 crc kubenswrapper[4970]: I1128 13:24:59.915567 4970 scope.go:117] "RemoveContainer" containerID="ca9c04814cbeae9949fc3df08a9deba4cf5b237e7a041e34809909ade3306228" Nov 28 13:24:59 crc kubenswrapper[4970]: I1128 13:24:59.949354 4970 scope.go:117] "RemoveContainer" containerID="61ffead2ec33c64616686a0a1bf5037299cc6617b355521167580fe7f77e448a" Nov 28 13:24:59 crc kubenswrapper[4970]: E1128 13:24:59.949774 4970 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"61ffead2ec33c64616686a0a1bf5037299cc6617b355521167580fe7f77e448a\": container with ID starting with 61ffead2ec33c64616686a0a1bf5037299cc6617b355521167580fe7f77e448a not found: ID does not exist" containerID="61ffead2ec33c64616686a0a1bf5037299cc6617b355521167580fe7f77e448a" Nov 28 13:24:59 crc kubenswrapper[4970]: I1128 13:24:59.949811 4970 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"61ffead2ec33c64616686a0a1bf5037299cc6617b355521167580fe7f77e448a"} err="failed to get container status \"61ffead2ec33c64616686a0a1bf5037299cc6617b355521167580fe7f77e448a\": rpc 
error: code = NotFound desc = could not find container \"61ffead2ec33c64616686a0a1bf5037299cc6617b355521167580fe7f77e448a\": container with ID starting with 61ffead2ec33c64616686a0a1bf5037299cc6617b355521167580fe7f77e448a not found: ID does not exist" Nov 28 13:24:59 crc kubenswrapper[4970]: I1128 13:24:59.949843 4970 scope.go:117] "RemoveContainer" containerID="488e868e28f939a9fb5e1536ee9d110a27d69b0d667ea9fe850be4cee9b3139c" Nov 28 13:24:59 crc kubenswrapper[4970]: E1128 13:24:59.952361 4970 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"488e868e28f939a9fb5e1536ee9d110a27d69b0d667ea9fe850be4cee9b3139c\": container with ID starting with 488e868e28f939a9fb5e1536ee9d110a27d69b0d667ea9fe850be4cee9b3139c not found: ID does not exist" containerID="488e868e28f939a9fb5e1536ee9d110a27d69b0d667ea9fe850be4cee9b3139c" Nov 28 13:24:59 crc kubenswrapper[4970]: I1128 13:24:59.952393 4970 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"488e868e28f939a9fb5e1536ee9d110a27d69b0d667ea9fe850be4cee9b3139c"} err="failed to get container status \"488e868e28f939a9fb5e1536ee9d110a27d69b0d667ea9fe850be4cee9b3139c\": rpc error: code = NotFound desc = could not find container \"488e868e28f939a9fb5e1536ee9d110a27d69b0d667ea9fe850be4cee9b3139c\": container with ID starting with 488e868e28f939a9fb5e1536ee9d110a27d69b0d667ea9fe850be4cee9b3139c not found: ID does not exist" Nov 28 13:24:59 crc kubenswrapper[4970]: I1128 13:24:59.952416 4970 scope.go:117] "RemoveContainer" containerID="ca9c04814cbeae9949fc3df08a9deba4cf5b237e7a041e34809909ade3306228" Nov 28 13:24:59 crc kubenswrapper[4970]: E1128 13:24:59.953429 4970 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ca9c04814cbeae9949fc3df08a9deba4cf5b237e7a041e34809909ade3306228\": container with ID starting with ca9c04814cbeae9949fc3df08a9deba4cf5b237e7a041e34809909ade3306228 not found: ID does not exist" containerID="ca9c04814cbeae9949fc3df08a9deba4cf5b237e7a041e34809909ade3306228" Nov 28 13:24:59 crc kubenswrapper[4970]: I1128 13:24:59.953457 4970 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ca9c04814cbeae9949fc3df08a9deba4cf5b237e7a041e34809909ade3306228"} err="failed to get container status \"ca9c04814cbeae9949fc3df08a9deba4cf5b237e7a041e34809909ade3306228\": rpc error: code = NotFound desc = could not find container \"ca9c04814cbeae9949fc3df08a9deba4cf5b237e7a041e34809909ade3306228\": container with ID starting with ca9c04814cbeae9949fc3df08a9deba4cf5b237e7a041e34809909ade3306228 not found: ID does not exist" Nov 28 13:25:00 crc kubenswrapper[4970]: I1128 13:25:00.008112 4970 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-mwg2n"] Nov 28 13:25:00 crc kubenswrapper[4970]: I1128 13:25:00.013636 4970 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-mwg2n"] Nov 28 13:25:00 crc kubenswrapper[4970]: I1128 13:25:00.022374 4970 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-wwr74"] Nov 28 13:25:00 crc kubenswrapper[4970]: I1128 13:25:00.030247 4970 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-wwr74"] Nov 28 13:25:00 crc kubenswrapper[4970]: I1128 13:25:00.041038 4970 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/redhat-marketplace-8hjm2"] Nov 28 13:25:00 crc kubenswrapper[4970]: I1128 13:25:00.053516 4970 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-8hjm2"] Nov 28 13:25:00 crc kubenswrapper[4970]: I1128 13:25:00.061050 4970 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-nccsb"] Nov 28 13:25:00 crc kubenswrapper[4970]: I1128 13:25:00.067755 4970 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-nccsb"] Nov 28 13:25:00 crc kubenswrapper[4970]: I1128 13:25:00.666713 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-6lxld" event={"ID":"cea9d8a5-d14f-4f9c-a800-815168dd799e","Type":"ContainerStarted","Data":"0740c2fab97d0a545b4f25e29cca8e359806d3599caed16cffab927197544f52"} Nov 28 13:25:00 crc kubenswrapper[4970]: I1128 13:25:00.667044 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-6lxld" event={"ID":"cea9d8a5-d14f-4f9c-a800-815168dd799e","Type":"ContainerStarted","Data":"22e637b1343b71f19ae1deeed3a603f810bdadef45767cda47469651b740ab7d"} Nov 28 13:25:00 crc kubenswrapper[4970]: I1128 13:25:00.669189 4970 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-6lxld" Nov 28 13:25:00 crc kubenswrapper[4970]: I1128 13:25:00.670407 4970 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-6lxld" Nov 28 13:25:00 crc kubenswrapper[4970]: I1128 13:25:00.701311 4970 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-6lxld" podStartSLOduration=2.701287849 podStartE2EDuration="2.701287849s" podCreationTimestamp="2025-11-28 13:24:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-28 13:25:00.684522614 +0000 UTC m=+311.537404424" watchObservedRunningTime="2025-11-28 13:25:00.701287849 +0000 UTC m=+311.554169649" Nov 28 13:25:00 crc kubenswrapper[4970]: I1128 13:25:00.732079 4970 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-wbnmk"] Nov 28 13:25:00 crc kubenswrapper[4970]: E1128 13:25:00.732322 4970 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="514859c2-bd3c-4ccb-90b0-61180a1bc297" containerName="marketplace-operator" Nov 28 13:25:00 crc kubenswrapper[4970]: I1128 13:25:00.732337 4970 state_mem.go:107] "Deleted CPUSet assignment" podUID="514859c2-bd3c-4ccb-90b0-61180a1bc297" containerName="marketplace-operator" Nov 28 13:25:00 crc kubenswrapper[4970]: E1128 13:25:00.732347 4970 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="514859c2-bd3c-4ccb-90b0-61180a1bc297" containerName="marketplace-operator" Nov 28 13:25:00 crc kubenswrapper[4970]: I1128 13:25:00.732353 4970 state_mem.go:107] "Deleted CPUSet assignment" podUID="514859c2-bd3c-4ccb-90b0-61180a1bc297" containerName="marketplace-operator" Nov 28 13:25:00 crc kubenswrapper[4970]: E1128 13:25:00.732366 4970 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ac87c2e3-5a6b-4998-8db7-165e571f6f52" containerName="registry-server" Nov 28 13:25:00 crc kubenswrapper[4970]: I1128 13:25:00.732373 4970 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="ac87c2e3-5a6b-4998-8db7-165e571f6f52" containerName="registry-server" Nov 28 13:25:00 crc kubenswrapper[4970]: E1128 13:25:00.732382 4970 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a92f4929-dcab-4362-a5f2-c648f274bf04" containerName="registry-server" Nov 28 13:25:00 crc kubenswrapper[4970]: I1128 13:25:00.732387 4970 state_mem.go:107] "Deleted CPUSet assignment" podUID="a92f4929-dcab-4362-a5f2-c648f274bf04" containerName="registry-server" Nov 28 13:25:00 crc kubenswrapper[4970]: E1128 13:25:00.732394 4970 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="619af67d-331c-4b38-b536-269ba823fd75" containerName="registry-server" Nov 28 13:25:00 crc kubenswrapper[4970]: I1128 13:25:00.732399 4970 state_mem.go:107] "Deleted CPUSet assignment" podUID="619af67d-331c-4b38-b536-269ba823fd75" containerName="registry-server" Nov 28 13:25:00 crc kubenswrapper[4970]: E1128 13:25:00.732407 4970 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="619af67d-331c-4b38-b536-269ba823fd75" containerName="extract-content" Nov 28 13:25:00 crc kubenswrapper[4970]: I1128 13:25:00.732413 4970 state_mem.go:107] "Deleted CPUSet assignment" podUID="619af67d-331c-4b38-b536-269ba823fd75" containerName="extract-content" Nov 28 13:25:00 crc kubenswrapper[4970]: E1128 13:25:00.732420 4970 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a92f4929-dcab-4362-a5f2-c648f274bf04" containerName="extract-content" Nov 28 13:25:00 crc kubenswrapper[4970]: I1128 13:25:00.732425 4970 state_mem.go:107] "Deleted CPUSet assignment" podUID="a92f4929-dcab-4362-a5f2-c648f274bf04" containerName="extract-content" Nov 28 13:25:00 crc kubenswrapper[4970]: E1128 13:25:00.732435 4970 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ac87c2e3-5a6b-4998-8db7-165e571f6f52" containerName="extract-utilities" Nov 28 13:25:00 crc kubenswrapper[4970]: I1128 13:25:00.732441 4970 state_mem.go:107] "Deleted CPUSet assignment" podUID="ac87c2e3-5a6b-4998-8db7-165e571f6f52" containerName="extract-utilities" Nov 28 13:25:00 crc kubenswrapper[4970]: E1128 13:25:00.732450 4970 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a92f4929-dcab-4362-a5f2-c648f274bf04" containerName="extract-utilities" Nov 28 13:25:00 crc kubenswrapper[4970]: I1128 13:25:00.732455 4970 state_mem.go:107] "Deleted CPUSet assignment" podUID="a92f4929-dcab-4362-a5f2-c648f274bf04" containerName="extract-utilities" Nov 28 13:25:00 crc kubenswrapper[4970]: E1128 13:25:00.732463 4970 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fdf78924-9472-414e-baf6-822e511c464c" containerName="extract-content" Nov 28 13:25:00 crc kubenswrapper[4970]: I1128 13:25:00.732469 4970 state_mem.go:107] "Deleted CPUSet assignment" podUID="fdf78924-9472-414e-baf6-822e511c464c" containerName="extract-content" Nov 28 13:25:00 crc kubenswrapper[4970]: E1128 13:25:00.732475 4970 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fdf78924-9472-414e-baf6-822e511c464c" containerName="registry-server" Nov 28 13:25:00 crc kubenswrapper[4970]: I1128 13:25:00.732480 4970 state_mem.go:107] "Deleted CPUSet assignment" podUID="fdf78924-9472-414e-baf6-822e511c464c" containerName="registry-server" Nov 28 13:25:00 crc kubenswrapper[4970]: E1128 13:25:00.732487 4970 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fdf78924-9472-414e-baf6-822e511c464c" containerName="extract-utilities" Nov 28 13:25:00 crc kubenswrapper[4970]: I1128 13:25:00.732493 4970 
state_mem.go:107] "Deleted CPUSet assignment" podUID="fdf78924-9472-414e-baf6-822e511c464c" containerName="extract-utilities" Nov 28 13:25:00 crc kubenswrapper[4970]: E1128 13:25:00.732501 4970 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ac87c2e3-5a6b-4998-8db7-165e571f6f52" containerName="extract-content" Nov 28 13:25:00 crc kubenswrapper[4970]: I1128 13:25:00.732506 4970 state_mem.go:107] "Deleted CPUSet assignment" podUID="ac87c2e3-5a6b-4998-8db7-165e571f6f52" containerName="extract-content" Nov 28 13:25:00 crc kubenswrapper[4970]: E1128 13:25:00.732512 4970 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="619af67d-331c-4b38-b536-269ba823fd75" containerName="extract-utilities" Nov 28 13:25:00 crc kubenswrapper[4970]: I1128 13:25:00.732520 4970 state_mem.go:107] "Deleted CPUSet assignment" podUID="619af67d-331c-4b38-b536-269ba823fd75" containerName="extract-utilities" Nov 28 13:25:00 crc kubenswrapper[4970]: I1128 13:25:00.732601 4970 memory_manager.go:354] "RemoveStaleState removing state" podUID="514859c2-bd3c-4ccb-90b0-61180a1bc297" containerName="marketplace-operator" Nov 28 13:25:00 crc kubenswrapper[4970]: I1128 13:25:00.732611 4970 memory_manager.go:354] "RemoveStaleState removing state" podUID="619af67d-331c-4b38-b536-269ba823fd75" containerName="registry-server" Nov 28 13:25:00 crc kubenswrapper[4970]: I1128 13:25:00.732618 4970 memory_manager.go:354] "RemoveStaleState removing state" podUID="514859c2-bd3c-4ccb-90b0-61180a1bc297" containerName="marketplace-operator" Nov 28 13:25:00 crc kubenswrapper[4970]: I1128 13:25:00.732628 4970 memory_manager.go:354] "RemoveStaleState removing state" podUID="a92f4929-dcab-4362-a5f2-c648f274bf04" containerName="registry-server" Nov 28 13:25:00 crc kubenswrapper[4970]: I1128 13:25:00.732640 4970 memory_manager.go:354] "RemoveStaleState removing state" podUID="fdf78924-9472-414e-baf6-822e511c464c" containerName="registry-server" Nov 28 13:25:00 crc kubenswrapper[4970]: I1128 13:25:00.732649 4970 memory_manager.go:354] "RemoveStaleState removing state" podUID="ac87c2e3-5a6b-4998-8db7-165e571f6f52" containerName="registry-server" Nov 28 13:25:00 crc kubenswrapper[4970]: I1128 13:25:00.733279 4970 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-wbnmk" Nov 28 13:25:00 crc kubenswrapper[4970]: I1128 13:25:00.735191 4970 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Nov 28 13:25:00 crc kubenswrapper[4970]: I1128 13:25:00.781434 4970 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-wbnmk"] Nov 28 13:25:00 crc kubenswrapper[4970]: I1128 13:25:00.871993 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wc2nq\" (UniqueName: \"kubernetes.io/projected/fcde0e22-6f82-4495-932f-e5e57f31d4f7-kube-api-access-wc2nq\") pod \"certified-operators-wbnmk\" (UID: \"fcde0e22-6f82-4495-932f-e5e57f31d4f7\") " pod="openshift-marketplace/certified-operators-wbnmk" Nov 28 13:25:00 crc kubenswrapper[4970]: I1128 13:25:00.872057 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fcde0e22-6f82-4495-932f-e5e57f31d4f7-utilities\") pod \"certified-operators-wbnmk\" (UID: \"fcde0e22-6f82-4495-932f-e5e57f31d4f7\") " pod="openshift-marketplace/certified-operators-wbnmk" Nov 28 13:25:00 crc kubenswrapper[4970]: I1128 13:25:00.872087 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fcde0e22-6f82-4495-932f-e5e57f31d4f7-catalog-content\") pod \"certified-operators-wbnmk\" (UID: \"fcde0e22-6f82-4495-932f-e5e57f31d4f7\") " pod="openshift-marketplace/certified-operators-wbnmk" Nov 28 13:25:00 crc kubenswrapper[4970]: I1128 13:25:00.973758 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wc2nq\" (UniqueName: \"kubernetes.io/projected/fcde0e22-6f82-4495-932f-e5e57f31d4f7-kube-api-access-wc2nq\") pod \"certified-operators-wbnmk\" (UID: \"fcde0e22-6f82-4495-932f-e5e57f31d4f7\") " pod="openshift-marketplace/certified-operators-wbnmk" Nov 28 13:25:00 crc kubenswrapper[4970]: I1128 13:25:00.973834 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fcde0e22-6f82-4495-932f-e5e57f31d4f7-utilities\") pod \"certified-operators-wbnmk\" (UID: \"fcde0e22-6f82-4495-932f-e5e57f31d4f7\") " pod="openshift-marketplace/certified-operators-wbnmk" Nov 28 13:25:00 crc kubenswrapper[4970]: I1128 13:25:00.973885 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fcde0e22-6f82-4495-932f-e5e57f31d4f7-catalog-content\") pod \"certified-operators-wbnmk\" (UID: \"fcde0e22-6f82-4495-932f-e5e57f31d4f7\") " pod="openshift-marketplace/certified-operators-wbnmk" Nov 28 13:25:00 crc kubenswrapper[4970]: I1128 13:25:00.974401 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fcde0e22-6f82-4495-932f-e5e57f31d4f7-utilities\") pod \"certified-operators-wbnmk\" (UID: \"fcde0e22-6f82-4495-932f-e5e57f31d4f7\") " pod="openshift-marketplace/certified-operators-wbnmk" Nov 28 13:25:00 crc kubenswrapper[4970]: I1128 13:25:00.974575 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fcde0e22-6f82-4495-932f-e5e57f31d4f7-catalog-content\") pod \"certified-operators-wbnmk\" (UID: 
\"fcde0e22-6f82-4495-932f-e5e57f31d4f7\") " pod="openshift-marketplace/certified-operators-wbnmk" Nov 28 13:25:00 crc kubenswrapper[4970]: I1128 13:25:00.994780 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wc2nq\" (UniqueName: \"kubernetes.io/projected/fcde0e22-6f82-4495-932f-e5e57f31d4f7-kube-api-access-wc2nq\") pod \"certified-operators-wbnmk\" (UID: \"fcde0e22-6f82-4495-932f-e5e57f31d4f7\") " pod="openshift-marketplace/certified-operators-wbnmk" Nov 28 13:25:01 crc kubenswrapper[4970]: I1128 13:25:01.053100 4970 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-wbnmk" Nov 28 13:25:01 crc kubenswrapper[4970]: I1128 13:25:01.391314 4970 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="514859c2-bd3c-4ccb-90b0-61180a1bc297" path="/var/lib/kubelet/pods/514859c2-bd3c-4ccb-90b0-61180a1bc297/volumes" Nov 28 13:25:01 crc kubenswrapper[4970]: I1128 13:25:01.391984 4970 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="619af67d-331c-4b38-b536-269ba823fd75" path="/var/lib/kubelet/pods/619af67d-331c-4b38-b536-269ba823fd75/volumes" Nov 28 13:25:01 crc kubenswrapper[4970]: I1128 13:25:01.392536 4970 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a92f4929-dcab-4362-a5f2-c648f274bf04" path="/var/lib/kubelet/pods/a92f4929-dcab-4362-a5f2-c648f274bf04/volumes" Nov 28 13:25:01 crc kubenswrapper[4970]: I1128 13:25:01.393098 4970 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ac87c2e3-5a6b-4998-8db7-165e571f6f52" path="/var/lib/kubelet/pods/ac87c2e3-5a6b-4998-8db7-165e571f6f52/volumes" Nov 28 13:25:01 crc kubenswrapper[4970]: I1128 13:25:01.394315 4970 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fdf78924-9472-414e-baf6-822e511c464c" path="/var/lib/kubelet/pods/fdf78924-9472-414e-baf6-822e511c464c/volumes" Nov 28 13:25:01 crc kubenswrapper[4970]: I1128 13:25:01.483941 4970 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-wbnmk"] Nov 28 13:25:01 crc kubenswrapper[4970]: I1128 13:25:01.683286 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wbnmk" event={"ID":"fcde0e22-6f82-4495-932f-e5e57f31d4f7","Type":"ContainerStarted","Data":"db967d976ffaa093a897d2d970ed245d235284eeac39a4b97aac6548ab72ae24"} Nov 28 13:25:01 crc kubenswrapper[4970]: I1128 13:25:01.741721 4970 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-9f9f2"] Nov 28 13:25:01 crc kubenswrapper[4970]: I1128 13:25:01.742568 4970 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-9f9f2" Nov 28 13:25:01 crc kubenswrapper[4970]: I1128 13:25:01.746243 4970 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Nov 28 13:25:01 crc kubenswrapper[4970]: I1128 13:25:01.746364 4970 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-9f9f2"] Nov 28 13:25:01 crc kubenswrapper[4970]: I1128 13:25:01.889992 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4wttq\" (UniqueName: \"kubernetes.io/projected/933ca994-f31b-4c5a-b068-8942618eb443-kube-api-access-4wttq\") pod \"redhat-marketplace-9f9f2\" (UID: \"933ca994-f31b-4c5a-b068-8942618eb443\") " pod="openshift-marketplace/redhat-marketplace-9f9f2" Nov 28 13:25:01 crc kubenswrapper[4970]: I1128 13:25:01.890030 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/933ca994-f31b-4c5a-b068-8942618eb443-catalog-content\") pod \"redhat-marketplace-9f9f2\" (UID: \"933ca994-f31b-4c5a-b068-8942618eb443\") " pod="openshift-marketplace/redhat-marketplace-9f9f2" Nov 28 13:25:01 crc kubenswrapper[4970]: I1128 13:25:01.890075 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/933ca994-f31b-4c5a-b068-8942618eb443-utilities\") pod \"redhat-marketplace-9f9f2\" (UID: \"933ca994-f31b-4c5a-b068-8942618eb443\") " pod="openshift-marketplace/redhat-marketplace-9f9f2" Nov 28 13:25:01 crc kubenswrapper[4970]: I1128 13:25:01.992013 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/933ca994-f31b-4c5a-b068-8942618eb443-utilities\") pod \"redhat-marketplace-9f9f2\" (UID: \"933ca994-f31b-4c5a-b068-8942618eb443\") " pod="openshift-marketplace/redhat-marketplace-9f9f2" Nov 28 13:25:01 crc kubenswrapper[4970]: I1128 13:25:01.992169 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4wttq\" (UniqueName: \"kubernetes.io/projected/933ca994-f31b-4c5a-b068-8942618eb443-kube-api-access-4wttq\") pod \"redhat-marketplace-9f9f2\" (UID: \"933ca994-f31b-4c5a-b068-8942618eb443\") " pod="openshift-marketplace/redhat-marketplace-9f9f2" Nov 28 13:25:01 crc kubenswrapper[4970]: I1128 13:25:01.992206 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/933ca994-f31b-4c5a-b068-8942618eb443-catalog-content\") pod \"redhat-marketplace-9f9f2\" (UID: \"933ca994-f31b-4c5a-b068-8942618eb443\") " pod="openshift-marketplace/redhat-marketplace-9f9f2" Nov 28 13:25:01 crc kubenswrapper[4970]: I1128 13:25:01.993020 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/933ca994-f31b-4c5a-b068-8942618eb443-utilities\") pod \"redhat-marketplace-9f9f2\" (UID: \"933ca994-f31b-4c5a-b068-8942618eb443\") " pod="openshift-marketplace/redhat-marketplace-9f9f2" Nov 28 13:25:01 crc kubenswrapper[4970]: I1128 13:25:01.993122 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/933ca994-f31b-4c5a-b068-8942618eb443-catalog-content\") pod \"redhat-marketplace-9f9f2\" (UID: 
\"933ca994-f31b-4c5a-b068-8942618eb443\") " pod="openshift-marketplace/redhat-marketplace-9f9f2" Nov 28 13:25:02 crc kubenswrapper[4970]: I1128 13:25:02.029838 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4wttq\" (UniqueName: \"kubernetes.io/projected/933ca994-f31b-4c5a-b068-8942618eb443-kube-api-access-4wttq\") pod \"redhat-marketplace-9f9f2\" (UID: \"933ca994-f31b-4c5a-b068-8942618eb443\") " pod="openshift-marketplace/redhat-marketplace-9f9f2" Nov 28 13:25:02 crc kubenswrapper[4970]: I1128 13:25:02.120355 4970 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-9f9f2" Nov 28 13:25:02 crc kubenswrapper[4970]: I1128 13:25:02.545379 4970 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-9f9f2"] Nov 28 13:25:02 crc kubenswrapper[4970]: I1128 13:25:02.692157 4970 generic.go:334] "Generic (PLEG): container finished" podID="fcde0e22-6f82-4495-932f-e5e57f31d4f7" containerID="e4e26a64a6eb6975285669de0602582ea5aa8b12ceea9b4ee467da7fea1e9f1b" exitCode=0 Nov 28 13:25:02 crc kubenswrapper[4970]: I1128 13:25:02.692271 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wbnmk" event={"ID":"fcde0e22-6f82-4495-932f-e5e57f31d4f7","Type":"ContainerDied","Data":"e4e26a64a6eb6975285669de0602582ea5aa8b12ceea9b4ee467da7fea1e9f1b"} Nov 28 13:25:02 crc kubenswrapper[4970]: I1128 13:25:02.693509 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9f9f2" event={"ID":"933ca994-f31b-4c5a-b068-8942618eb443","Type":"ContainerStarted","Data":"945108aed8fa908503d9a9968f5a81aaefd80cebd6756e17015f48a05d790b1d"} Nov 28 13:25:03 crc kubenswrapper[4970]: I1128 13:25:03.328820 4970 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-z7zkq"] Nov 28 13:25:03 crc kubenswrapper[4970]: I1128 13:25:03.330472 4970 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-z7zkq" Nov 28 13:25:03 crc kubenswrapper[4970]: I1128 13:25:03.332895 4970 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Nov 28 13:25:03 crc kubenswrapper[4970]: I1128 13:25:03.338898 4970 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-z7zkq"] Nov 28 13:25:03 crc kubenswrapper[4970]: I1128 13:25:03.409975 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wml4f\" (UniqueName: \"kubernetes.io/projected/dd8f781f-7121-4875-adec-2318c2ecd8e2-kube-api-access-wml4f\") pod \"redhat-operators-z7zkq\" (UID: \"dd8f781f-7121-4875-adec-2318c2ecd8e2\") " pod="openshift-marketplace/redhat-operators-z7zkq" Nov 28 13:25:03 crc kubenswrapper[4970]: I1128 13:25:03.410035 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dd8f781f-7121-4875-adec-2318c2ecd8e2-catalog-content\") pod \"redhat-operators-z7zkq\" (UID: \"dd8f781f-7121-4875-adec-2318c2ecd8e2\") " pod="openshift-marketplace/redhat-operators-z7zkq" Nov 28 13:25:03 crc kubenswrapper[4970]: I1128 13:25:03.410082 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dd8f781f-7121-4875-adec-2318c2ecd8e2-utilities\") pod \"redhat-operators-z7zkq\" (UID: \"dd8f781f-7121-4875-adec-2318c2ecd8e2\") " pod="openshift-marketplace/redhat-operators-z7zkq" Nov 28 13:25:03 crc kubenswrapper[4970]: I1128 13:25:03.510754 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dd8f781f-7121-4875-adec-2318c2ecd8e2-utilities\") pod \"redhat-operators-z7zkq\" (UID: \"dd8f781f-7121-4875-adec-2318c2ecd8e2\") " pod="openshift-marketplace/redhat-operators-z7zkq" Nov 28 13:25:03 crc kubenswrapper[4970]: I1128 13:25:03.511176 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wml4f\" (UniqueName: \"kubernetes.io/projected/dd8f781f-7121-4875-adec-2318c2ecd8e2-kube-api-access-wml4f\") pod \"redhat-operators-z7zkq\" (UID: \"dd8f781f-7121-4875-adec-2318c2ecd8e2\") " pod="openshift-marketplace/redhat-operators-z7zkq" Nov 28 13:25:03 crc kubenswrapper[4970]: I1128 13:25:03.511226 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dd8f781f-7121-4875-adec-2318c2ecd8e2-catalog-content\") pod \"redhat-operators-z7zkq\" (UID: \"dd8f781f-7121-4875-adec-2318c2ecd8e2\") " pod="openshift-marketplace/redhat-operators-z7zkq" Nov 28 13:25:03 crc kubenswrapper[4970]: I1128 13:25:03.511302 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dd8f781f-7121-4875-adec-2318c2ecd8e2-utilities\") pod \"redhat-operators-z7zkq\" (UID: \"dd8f781f-7121-4875-adec-2318c2ecd8e2\") " pod="openshift-marketplace/redhat-operators-z7zkq" Nov 28 13:25:03 crc kubenswrapper[4970]: I1128 13:25:03.511677 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dd8f781f-7121-4875-adec-2318c2ecd8e2-catalog-content\") pod \"redhat-operators-z7zkq\" (UID: \"dd8f781f-7121-4875-adec-2318c2ecd8e2\") " 
pod="openshift-marketplace/redhat-operators-z7zkq" Nov 28 13:25:03 crc kubenswrapper[4970]: I1128 13:25:03.531988 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wml4f\" (UniqueName: \"kubernetes.io/projected/dd8f781f-7121-4875-adec-2318c2ecd8e2-kube-api-access-wml4f\") pod \"redhat-operators-z7zkq\" (UID: \"dd8f781f-7121-4875-adec-2318c2ecd8e2\") " pod="openshift-marketplace/redhat-operators-z7zkq" Nov 28 13:25:03 crc kubenswrapper[4970]: I1128 13:25:03.699140 4970 generic.go:334] "Generic (PLEG): container finished" podID="933ca994-f31b-4c5a-b068-8942618eb443" containerID="4c486520a71b8ac6f8729f3bf342d5a2c76adaf4435c0886d897ee760d04dadb" exitCode=0 Nov 28 13:25:03 crc kubenswrapper[4970]: I1128 13:25:03.699179 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9f9f2" event={"ID":"933ca994-f31b-4c5a-b068-8942618eb443","Type":"ContainerDied","Data":"4c486520a71b8ac6f8729f3bf342d5a2c76adaf4435c0886d897ee760d04dadb"} Nov 28 13:25:03 crc kubenswrapper[4970]: I1128 13:25:03.702097 4970 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-z7zkq" Nov 28 13:25:04 crc kubenswrapper[4970]: I1128 13:25:04.148258 4970 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-z7zkq"] Nov 28 13:25:04 crc kubenswrapper[4970]: I1128 13:25:04.534278 4970 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-dhrhm"] Nov 28 13:25:04 crc kubenswrapper[4970]: I1128 13:25:04.535839 4970 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-dhrhm" Nov 28 13:25:04 crc kubenswrapper[4970]: I1128 13:25:04.538518 4970 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Nov 28 13:25:04 crc kubenswrapper[4970]: I1128 13:25:04.545172 4970 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-dhrhm"] Nov 28 13:25:04 crc kubenswrapper[4970]: I1128 13:25:04.636284 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2c9cv\" (UniqueName: \"kubernetes.io/projected/bda49097-ef3b-4e2f-8f8c-cb54ea0818b7-kube-api-access-2c9cv\") pod \"community-operators-dhrhm\" (UID: \"bda49097-ef3b-4e2f-8f8c-cb54ea0818b7\") " pod="openshift-marketplace/community-operators-dhrhm" Nov 28 13:25:04 crc kubenswrapper[4970]: I1128 13:25:04.636562 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bda49097-ef3b-4e2f-8f8c-cb54ea0818b7-catalog-content\") pod \"community-operators-dhrhm\" (UID: \"bda49097-ef3b-4e2f-8f8c-cb54ea0818b7\") " pod="openshift-marketplace/community-operators-dhrhm" Nov 28 13:25:04 crc kubenswrapper[4970]: I1128 13:25:04.637146 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bda49097-ef3b-4e2f-8f8c-cb54ea0818b7-utilities\") pod \"community-operators-dhrhm\" (UID: \"bda49097-ef3b-4e2f-8f8c-cb54ea0818b7\") " pod="openshift-marketplace/community-operators-dhrhm" Nov 28 13:25:04 crc kubenswrapper[4970]: I1128 13:25:04.712504 4970 generic.go:334] "Generic (PLEG): container finished" podID="dd8f781f-7121-4875-adec-2318c2ecd8e2" 
containerID="fde8ff4ac635c181a498149870d115480409062ccd13f21c5115e74a7a5c0163" exitCode=0 Nov 28 13:25:04 crc kubenswrapper[4970]: I1128 13:25:04.712605 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-z7zkq" event={"ID":"dd8f781f-7121-4875-adec-2318c2ecd8e2","Type":"ContainerDied","Data":"fde8ff4ac635c181a498149870d115480409062ccd13f21c5115e74a7a5c0163"} Nov 28 13:25:04 crc kubenswrapper[4970]: I1128 13:25:04.712638 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-z7zkq" event={"ID":"dd8f781f-7121-4875-adec-2318c2ecd8e2","Type":"ContainerStarted","Data":"83d97f01ef72502dc8d5129874c8724673631a967a278f77f407912449ffe830"} Nov 28 13:25:04 crc kubenswrapper[4970]: I1128 13:25:04.715303 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9f9f2" event={"ID":"933ca994-f31b-4c5a-b068-8942618eb443","Type":"ContainerStarted","Data":"674b75eba3baad2ab8bb29b0fa474b0d61bf74e03e6f293a2136143e49b07242"} Nov 28 13:25:04 crc kubenswrapper[4970]: I1128 13:25:04.718086 4970 generic.go:334] "Generic (PLEG): container finished" podID="fcde0e22-6f82-4495-932f-e5e57f31d4f7" containerID="a7f1c3c07938931ce723ee384b207055c9300e92be5f313ea24d74ca956aca0f" exitCode=0 Nov 28 13:25:04 crc kubenswrapper[4970]: I1128 13:25:04.718122 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wbnmk" event={"ID":"fcde0e22-6f82-4495-932f-e5e57f31d4f7","Type":"ContainerDied","Data":"a7f1c3c07938931ce723ee384b207055c9300e92be5f313ea24d74ca956aca0f"} Nov 28 13:25:04 crc kubenswrapper[4970]: I1128 13:25:04.738543 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bda49097-ef3b-4e2f-8f8c-cb54ea0818b7-catalog-content\") pod \"community-operators-dhrhm\" (UID: \"bda49097-ef3b-4e2f-8f8c-cb54ea0818b7\") " pod="openshift-marketplace/community-operators-dhrhm" Nov 28 13:25:04 crc kubenswrapper[4970]: I1128 13:25:04.738607 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bda49097-ef3b-4e2f-8f8c-cb54ea0818b7-utilities\") pod \"community-operators-dhrhm\" (UID: \"bda49097-ef3b-4e2f-8f8c-cb54ea0818b7\") " pod="openshift-marketplace/community-operators-dhrhm" Nov 28 13:25:04 crc kubenswrapper[4970]: I1128 13:25:04.738661 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2c9cv\" (UniqueName: \"kubernetes.io/projected/bda49097-ef3b-4e2f-8f8c-cb54ea0818b7-kube-api-access-2c9cv\") pod \"community-operators-dhrhm\" (UID: \"bda49097-ef3b-4e2f-8f8c-cb54ea0818b7\") " pod="openshift-marketplace/community-operators-dhrhm" Nov 28 13:25:04 crc kubenswrapper[4970]: I1128 13:25:04.739032 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bda49097-ef3b-4e2f-8f8c-cb54ea0818b7-catalog-content\") pod \"community-operators-dhrhm\" (UID: \"bda49097-ef3b-4e2f-8f8c-cb54ea0818b7\") " pod="openshift-marketplace/community-operators-dhrhm" Nov 28 13:25:04 crc kubenswrapper[4970]: I1128 13:25:04.739237 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bda49097-ef3b-4e2f-8f8c-cb54ea0818b7-utilities\") pod \"community-operators-dhrhm\" (UID: \"bda49097-ef3b-4e2f-8f8c-cb54ea0818b7\") " 
pod="openshift-marketplace/community-operators-dhrhm" Nov 28 13:25:04 crc kubenswrapper[4970]: I1128 13:25:04.760288 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2c9cv\" (UniqueName: \"kubernetes.io/projected/bda49097-ef3b-4e2f-8f8c-cb54ea0818b7-kube-api-access-2c9cv\") pod \"community-operators-dhrhm\" (UID: \"bda49097-ef3b-4e2f-8f8c-cb54ea0818b7\") " pod="openshift-marketplace/community-operators-dhrhm" Nov 28 13:25:04 crc kubenswrapper[4970]: I1128 13:25:04.879000 4970 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-dhrhm" Nov 28 13:25:05 crc kubenswrapper[4970]: I1128 13:25:05.295080 4970 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-dhrhm"] Nov 28 13:25:05 crc kubenswrapper[4970]: W1128 13:25:05.315049 4970 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbda49097_ef3b_4e2f_8f8c_cb54ea0818b7.slice/crio-b5bfc4d9a2e0302241c50d5cdf9af312807a6a202dd2bcdcc42eba686a8ccf47 WatchSource:0}: Error finding container b5bfc4d9a2e0302241c50d5cdf9af312807a6a202dd2bcdcc42eba686a8ccf47: Status 404 returned error can't find the container with id b5bfc4d9a2e0302241c50d5cdf9af312807a6a202dd2bcdcc42eba686a8ccf47 Nov 28 13:25:05 crc kubenswrapper[4970]: I1128 13:25:05.725434 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-z7zkq" event={"ID":"dd8f781f-7121-4875-adec-2318c2ecd8e2","Type":"ContainerStarted","Data":"647758d717b4d3ad0b43db497516c05785b220cba46328f5329dde80fead8ca7"} Nov 28 13:25:05 crc kubenswrapper[4970]: I1128 13:25:05.727623 4970 generic.go:334] "Generic (PLEG): container finished" podID="933ca994-f31b-4c5a-b068-8942618eb443" containerID="674b75eba3baad2ab8bb29b0fa474b0d61bf74e03e6f293a2136143e49b07242" exitCode=0 Nov 28 13:25:05 crc kubenswrapper[4970]: I1128 13:25:05.727679 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9f9f2" event={"ID":"933ca994-f31b-4c5a-b068-8942618eb443","Type":"ContainerDied","Data":"674b75eba3baad2ab8bb29b0fa474b0d61bf74e03e6f293a2136143e49b07242"} Nov 28 13:25:05 crc kubenswrapper[4970]: I1128 13:25:05.732595 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wbnmk" event={"ID":"fcde0e22-6f82-4495-932f-e5e57f31d4f7","Type":"ContainerStarted","Data":"a1609be9f8e0aa4c2875f47a0e366ac443b9dc5c6e0996a99ef7f92149505165"} Nov 28 13:25:05 crc kubenswrapper[4970]: I1128 13:25:05.739636 4970 generic.go:334] "Generic (PLEG): container finished" podID="bda49097-ef3b-4e2f-8f8c-cb54ea0818b7" containerID="a691ecff05323c204a55e65a4faaec0e7542c5c3c64ed86289a694fe7167cb2a" exitCode=0 Nov 28 13:25:05 crc kubenswrapper[4970]: I1128 13:25:05.739754 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-dhrhm" event={"ID":"bda49097-ef3b-4e2f-8f8c-cb54ea0818b7","Type":"ContainerDied","Data":"a691ecff05323c204a55e65a4faaec0e7542c5c3c64ed86289a694fe7167cb2a"} Nov 28 13:25:05 crc kubenswrapper[4970]: I1128 13:25:05.739812 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-dhrhm" event={"ID":"bda49097-ef3b-4e2f-8f8c-cb54ea0818b7","Type":"ContainerStarted","Data":"b5bfc4d9a2e0302241c50d5cdf9af312807a6a202dd2bcdcc42eba686a8ccf47"} Nov 28 13:25:05 crc kubenswrapper[4970]: I1128 13:25:05.759734 4970 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-wbnmk" podStartSLOduration=3.307328141 podStartE2EDuration="5.759716379s" podCreationTimestamp="2025-11-28 13:25:00 +0000 UTC" firstStartedPulling="2025-11-28 13:25:02.694463928 +0000 UTC m=+313.547345738" lastFinishedPulling="2025-11-28 13:25:05.146852176 +0000 UTC m=+315.999733976" observedRunningTime="2025-11-28 13:25:05.757009868 +0000 UTC m=+316.609891668" watchObservedRunningTime="2025-11-28 13:25:05.759716379 +0000 UTC m=+316.612598179" Nov 28 13:25:06 crc kubenswrapper[4970]: I1128 13:25:06.745067 4970 generic.go:334] "Generic (PLEG): container finished" podID="dd8f781f-7121-4875-adec-2318c2ecd8e2" containerID="647758d717b4d3ad0b43db497516c05785b220cba46328f5329dde80fead8ca7" exitCode=0 Nov 28 13:25:06 crc kubenswrapper[4970]: I1128 13:25:06.745291 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-z7zkq" event={"ID":"dd8f781f-7121-4875-adec-2318c2ecd8e2","Type":"ContainerDied","Data":"647758d717b4d3ad0b43db497516c05785b220cba46328f5329dde80fead8ca7"} Nov 28 13:25:06 crc kubenswrapper[4970]: I1128 13:25:06.748975 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9f9f2" event={"ID":"933ca994-f31b-4c5a-b068-8942618eb443","Type":"ContainerStarted","Data":"4189f61fa9e98958980a1f9e0bb774d62030f2462dc31f19cbffad341796dac4"} Nov 28 13:25:06 crc kubenswrapper[4970]: I1128 13:25:06.751476 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-dhrhm" event={"ID":"bda49097-ef3b-4e2f-8f8c-cb54ea0818b7","Type":"ContainerStarted","Data":"430a86deb201c8eeaf125859a70253320a238844117c9db008f5b4751ddfbdd2"} Nov 28 13:25:06 crc kubenswrapper[4970]: I1128 13:25:06.803705 4970 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-9f9f2" podStartSLOduration=3.114126699 podStartE2EDuration="5.803691815s" podCreationTimestamp="2025-11-28 13:25:01 +0000 UTC" firstStartedPulling="2025-11-28 13:25:03.704977647 +0000 UTC m=+314.557859447" lastFinishedPulling="2025-11-28 13:25:06.394542763 +0000 UTC m=+317.247424563" observedRunningTime="2025-11-28 13:25:06.800573072 +0000 UTC m=+317.653454872" watchObservedRunningTime="2025-11-28 13:25:06.803691815 +0000 UTC m=+317.656573615" Nov 28 13:25:07 crc kubenswrapper[4970]: I1128 13:25:07.758712 4970 generic.go:334] "Generic (PLEG): container finished" podID="bda49097-ef3b-4e2f-8f8c-cb54ea0818b7" containerID="430a86deb201c8eeaf125859a70253320a238844117c9db008f5b4751ddfbdd2" exitCode=0 Nov 28 13:25:07 crc kubenswrapper[4970]: I1128 13:25:07.758759 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-dhrhm" event={"ID":"bda49097-ef3b-4e2f-8f8c-cb54ea0818b7","Type":"ContainerDied","Data":"430a86deb201c8eeaf125859a70253320a238844117c9db008f5b4751ddfbdd2"} Nov 28 13:25:09 crc kubenswrapper[4970]: I1128 13:25:09.776503 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-dhrhm" event={"ID":"bda49097-ef3b-4e2f-8f8c-cb54ea0818b7","Type":"ContainerStarted","Data":"bcfa88b56a6a26b7e5cdfce5846312322917533a9afe62088784282b0981aec6"} Nov 28 13:25:09 crc kubenswrapper[4970]: I1128 13:25:09.778545 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-z7zkq" 
event={"ID":"dd8f781f-7121-4875-adec-2318c2ecd8e2","Type":"ContainerStarted","Data":"613857dc1751aa895bba1f41c31c43a370d289a14e5b6babd964fa9c52bed52c"} Nov 28 13:25:09 crc kubenswrapper[4970]: I1128 13:25:09.791907 4970 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-dhrhm" podStartSLOduration=2.545256629 podStartE2EDuration="5.791885748s" podCreationTimestamp="2025-11-28 13:25:04 +0000 UTC" firstStartedPulling="2025-11-28 13:25:05.741369307 +0000 UTC m=+316.594251097" lastFinishedPulling="2025-11-28 13:25:08.987998416 +0000 UTC m=+319.840880216" observedRunningTime="2025-11-28 13:25:09.791793145 +0000 UTC m=+320.644674965" watchObservedRunningTime="2025-11-28 13:25:09.791885748 +0000 UTC m=+320.644767548" Nov 28 13:25:09 crc kubenswrapper[4970]: I1128 13:25:09.811324 4970 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-z7zkq" podStartSLOduration=4.324237969 podStartE2EDuration="6.811305082s" podCreationTimestamp="2025-11-28 13:25:03 +0000 UTC" firstStartedPulling="2025-11-28 13:25:04.713778464 +0000 UTC m=+315.566660264" lastFinishedPulling="2025-11-28 13:25:07.200845577 +0000 UTC m=+318.053727377" observedRunningTime="2025-11-28 13:25:09.807296931 +0000 UTC m=+320.660178731" watchObservedRunningTime="2025-11-28 13:25:09.811305082 +0000 UTC m=+320.664186892" Nov 28 13:25:11 crc kubenswrapper[4970]: I1128 13:25:11.053866 4970 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-wbnmk" Nov 28 13:25:11 crc kubenswrapper[4970]: I1128 13:25:11.054707 4970 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-wbnmk" Nov 28 13:25:11 crc kubenswrapper[4970]: I1128 13:25:11.094071 4970 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-wbnmk" Nov 28 13:25:11 crc kubenswrapper[4970]: I1128 13:25:11.827167 4970 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-wbnmk" Nov 28 13:25:12 crc kubenswrapper[4970]: I1128 13:25:12.121438 4970 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-9f9f2" Nov 28 13:25:12 crc kubenswrapper[4970]: I1128 13:25:12.121525 4970 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-9f9f2" Nov 28 13:25:12 crc kubenswrapper[4970]: I1128 13:25:12.164453 4970 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-9f9f2" Nov 28 13:25:12 crc kubenswrapper[4970]: I1128 13:25:12.832934 4970 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-9f9f2" Nov 28 13:25:13 crc kubenswrapper[4970]: I1128 13:25:13.703590 4970 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-z7zkq" Nov 28 13:25:13 crc kubenswrapper[4970]: I1128 13:25:13.703869 4970 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-z7zkq" Nov 28 13:25:14 crc kubenswrapper[4970]: I1128 13:25:14.744949 4970 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-z7zkq" podUID="dd8f781f-7121-4875-adec-2318c2ecd8e2" containerName="registry-server" 
probeResult="failure" output=< Nov 28 13:25:14 crc kubenswrapper[4970]: timeout: failed to connect service ":50051" within 1s Nov 28 13:25:14 crc kubenswrapper[4970]: > Nov 28 13:25:14 crc kubenswrapper[4970]: I1128 13:25:14.880053 4970 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-dhrhm" Nov 28 13:25:14 crc kubenswrapper[4970]: I1128 13:25:14.880103 4970 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-dhrhm" Nov 28 13:25:14 crc kubenswrapper[4970]: I1128 13:25:14.918661 4970 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-dhrhm" Nov 28 13:25:15 crc kubenswrapper[4970]: I1128 13:25:15.842139 4970 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-dhrhm" Nov 28 13:25:16 crc kubenswrapper[4970]: I1128 13:25:16.700721 4970 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-66df7c8f76-zmjh4" Nov 28 13:25:16 crc kubenswrapper[4970]: I1128 13:25:16.716810 4970 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-574c848897-sqzn9"] Nov 28 13:25:16 crc kubenswrapper[4970]: I1128 13:25:16.717200 4970 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-574c848897-sqzn9" podUID="a4c98945-289e-4bad-ae74-d0a4feada930" containerName="controller-manager" containerID="cri-o://d1641923deb8c34d166781b8fd3e9cdd0c4ec1f0439a97161948a9568ee5082d" gracePeriod=30 Nov 28 13:25:16 crc kubenswrapper[4970]: I1128 13:25:16.782733 4970 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-657fdc8645-zvsmm"] Nov 28 13:25:16 crc kubenswrapper[4970]: I1128 13:25:16.783202 4970 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-657fdc8645-zvsmm" podUID="4369b978-7720-433e-ba2a-ef658c76b0b2" containerName="route-controller-manager" containerID="cri-o://5dc9690a7db96736a9c9fd1b5870947b048d3403f7810882625e37bb5e25d7ff" gracePeriod=30 Nov 28 13:25:16 crc kubenswrapper[4970]: I1128 13:25:16.814024 4970 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-tjj4p"] Nov 28 13:25:18 crc kubenswrapper[4970]: I1128 13:25:18.707667 4970 patch_prober.go:28] interesting pod/route-controller-manager-657fdc8645-zvsmm container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.57:8443/healthz\": dial tcp 10.217.0.57:8443: connect: connection refused" start-of-body= Nov 28 13:25:18 crc kubenswrapper[4970]: I1128 13:25:18.707774 4970 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-657fdc8645-zvsmm" podUID="4369b978-7720-433e-ba2a-ef658c76b0b2" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.57:8443/healthz\": dial tcp 10.217.0.57:8443: connect: connection refused" Nov 28 13:25:18 crc kubenswrapper[4970]: I1128 13:25:18.722296 4970 patch_prober.go:28] interesting pod/controller-manager-574c848897-sqzn9 container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get 
\"https://10.217.0.58:8443/healthz\": dial tcp 10.217.0.58:8443: connect: connection refused" start-of-body= Nov 28 13:25:18 crc kubenswrapper[4970]: I1128 13:25:18.722379 4970 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-574c848897-sqzn9" podUID="a4c98945-289e-4bad-ae74-d0a4feada930" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.58:8443/healthz\": dial tcp 10.217.0.58:8443: connect: connection refused" Nov 28 13:25:18 crc kubenswrapper[4970]: I1128 13:25:18.826332 4970 generic.go:334] "Generic (PLEG): container finished" podID="a4c98945-289e-4bad-ae74-d0a4feada930" containerID="d1641923deb8c34d166781b8fd3e9cdd0c4ec1f0439a97161948a9568ee5082d" exitCode=0 Nov 28 13:25:18 crc kubenswrapper[4970]: I1128 13:25:18.826409 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-574c848897-sqzn9" event={"ID":"a4c98945-289e-4bad-ae74-d0a4feada930","Type":"ContainerDied","Data":"d1641923deb8c34d166781b8fd3e9cdd0c4ec1f0439a97161948a9568ee5082d"} Nov 28 13:25:20 crc kubenswrapper[4970]: I1128 13:25:20.837558 4970 generic.go:334] "Generic (PLEG): container finished" podID="4369b978-7720-433e-ba2a-ef658c76b0b2" containerID="5dc9690a7db96736a9c9fd1b5870947b048d3403f7810882625e37bb5e25d7ff" exitCode=0 Nov 28 13:25:20 crc kubenswrapper[4970]: I1128 13:25:20.837722 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-657fdc8645-zvsmm" event={"ID":"4369b978-7720-433e-ba2a-ef658c76b0b2","Type":"ContainerDied","Data":"5dc9690a7db96736a9c9fd1b5870947b048d3403f7810882625e37bb5e25d7ff"} Nov 28 13:25:21 crc kubenswrapper[4970]: I1128 13:25:21.254121 4970 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-574c848897-sqzn9" Nov 28 13:25:21 crc kubenswrapper[4970]: I1128 13:25:21.291769 4970 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-86ddfdfb8c-rk57r"] Nov 28 13:25:21 crc kubenswrapper[4970]: E1128 13:25:21.292077 4970 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a4c98945-289e-4bad-ae74-d0a4feada930" containerName="controller-manager" Nov 28 13:25:21 crc kubenswrapper[4970]: I1128 13:25:21.292092 4970 state_mem.go:107] "Deleted CPUSet assignment" podUID="a4c98945-289e-4bad-ae74-d0a4feada930" containerName="controller-manager" Nov 28 13:25:21 crc kubenswrapper[4970]: I1128 13:25:21.292470 4970 memory_manager.go:354] "RemoveStaleState removing state" podUID="a4c98945-289e-4bad-ae74-d0a4feada930" containerName="controller-manager" Nov 28 13:25:21 crc kubenswrapper[4970]: I1128 13:25:21.293022 4970 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-86ddfdfb8c-rk57r" Nov 28 13:25:21 crc kubenswrapper[4970]: I1128 13:25:21.295814 4970 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-86ddfdfb8c-rk57r"] Nov 28 13:25:21 crc kubenswrapper[4970]: I1128 13:25:21.444445 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6csst\" (UniqueName: \"kubernetes.io/projected/a4c98945-289e-4bad-ae74-d0a4feada930-kube-api-access-6csst\") pod \"a4c98945-289e-4bad-ae74-d0a4feada930\" (UID: \"a4c98945-289e-4bad-ae74-d0a4feada930\") " Nov 28 13:25:21 crc kubenswrapper[4970]: I1128 13:25:21.444738 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a4c98945-289e-4bad-ae74-d0a4feada930-client-ca\") pod \"a4c98945-289e-4bad-ae74-d0a4feada930\" (UID: \"a4c98945-289e-4bad-ae74-d0a4feada930\") " Nov 28 13:25:21 crc kubenswrapper[4970]: I1128 13:25:21.444765 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a4c98945-289e-4bad-ae74-d0a4feada930-config\") pod \"a4c98945-289e-4bad-ae74-d0a4feada930\" (UID: \"a4c98945-289e-4bad-ae74-d0a4feada930\") " Nov 28 13:25:21 crc kubenswrapper[4970]: I1128 13:25:21.444806 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a4c98945-289e-4bad-ae74-d0a4feada930-serving-cert\") pod \"a4c98945-289e-4bad-ae74-d0a4feada930\" (UID: \"a4c98945-289e-4bad-ae74-d0a4feada930\") " Nov 28 13:25:21 crc kubenswrapper[4970]: I1128 13:25:21.444884 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/a4c98945-289e-4bad-ae74-d0a4feada930-proxy-ca-bundles\") pod \"a4c98945-289e-4bad-ae74-d0a4feada930\" (UID: \"a4c98945-289e-4bad-ae74-d0a4feada930\") " Nov 28 13:25:21 crc kubenswrapper[4970]: I1128 13:25:21.445036 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/78c697da-65a5-49c7-b38c-cc8fad4f5432-client-ca\") pod \"controller-manager-86ddfdfb8c-rk57r\" (UID: \"78c697da-65a5-49c7-b38c-cc8fad4f5432\") " pod="openshift-controller-manager/controller-manager-86ddfdfb8c-rk57r" Nov 28 13:25:21 crc kubenswrapper[4970]: I1128 13:25:21.445062 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/78c697da-65a5-49c7-b38c-cc8fad4f5432-config\") pod \"controller-manager-86ddfdfb8c-rk57r\" (UID: \"78c697da-65a5-49c7-b38c-cc8fad4f5432\") " pod="openshift-controller-manager/controller-manager-86ddfdfb8c-rk57r" Nov 28 13:25:21 crc kubenswrapper[4970]: I1128 13:25:21.445099 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p4v5r\" (UniqueName: \"kubernetes.io/projected/78c697da-65a5-49c7-b38c-cc8fad4f5432-kube-api-access-p4v5r\") pod \"controller-manager-86ddfdfb8c-rk57r\" (UID: \"78c697da-65a5-49c7-b38c-cc8fad4f5432\") " pod="openshift-controller-manager/controller-manager-86ddfdfb8c-rk57r" Nov 28 13:25:21 crc kubenswrapper[4970]: I1128 13:25:21.445143 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: 
\"kubernetes.io/configmap/78c697da-65a5-49c7-b38c-cc8fad4f5432-proxy-ca-bundles\") pod \"controller-manager-86ddfdfb8c-rk57r\" (UID: \"78c697da-65a5-49c7-b38c-cc8fad4f5432\") " pod="openshift-controller-manager/controller-manager-86ddfdfb8c-rk57r" Nov 28 13:25:21 crc kubenswrapper[4970]: I1128 13:25:21.445165 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/78c697da-65a5-49c7-b38c-cc8fad4f5432-serving-cert\") pod \"controller-manager-86ddfdfb8c-rk57r\" (UID: \"78c697da-65a5-49c7-b38c-cc8fad4f5432\") " pod="openshift-controller-manager/controller-manager-86ddfdfb8c-rk57r" Nov 28 13:25:21 crc kubenswrapper[4970]: I1128 13:25:21.445960 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a4c98945-289e-4bad-ae74-d0a4feada930-client-ca" (OuterVolumeSpecName: "client-ca") pod "a4c98945-289e-4bad-ae74-d0a4feada930" (UID: "a4c98945-289e-4bad-ae74-d0a4feada930"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 28 13:25:21 crc kubenswrapper[4970]: I1128 13:25:21.446285 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a4c98945-289e-4bad-ae74-d0a4feada930-config" (OuterVolumeSpecName: "config") pod "a4c98945-289e-4bad-ae74-d0a4feada930" (UID: "a4c98945-289e-4bad-ae74-d0a4feada930"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 28 13:25:21 crc kubenswrapper[4970]: I1128 13:25:21.446668 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a4c98945-289e-4bad-ae74-d0a4feada930-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "a4c98945-289e-4bad-ae74-d0a4feada930" (UID: "a4c98945-289e-4bad-ae74-d0a4feada930"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 28 13:25:21 crc kubenswrapper[4970]: I1128 13:25:21.456934 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a4c98945-289e-4bad-ae74-d0a4feada930-kube-api-access-6csst" (OuterVolumeSpecName: "kube-api-access-6csst") pod "a4c98945-289e-4bad-ae74-d0a4feada930" (UID: "a4c98945-289e-4bad-ae74-d0a4feada930"). InnerVolumeSpecName "kube-api-access-6csst". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 28 13:25:21 crc kubenswrapper[4970]: I1128 13:25:21.460598 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a4c98945-289e-4bad-ae74-d0a4feada930-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "a4c98945-289e-4bad-ae74-d0a4feada930" (UID: "a4c98945-289e-4bad-ae74-d0a4feada930"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 28 13:25:21 crc kubenswrapper[4970]: I1128 13:25:21.532762 4970 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-657fdc8645-zvsmm" Nov 28 13:25:21 crc kubenswrapper[4970]: I1128 13:25:21.546181 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p4v5r\" (UniqueName: \"kubernetes.io/projected/78c697da-65a5-49c7-b38c-cc8fad4f5432-kube-api-access-p4v5r\") pod \"controller-manager-86ddfdfb8c-rk57r\" (UID: \"78c697da-65a5-49c7-b38c-cc8fad4f5432\") " pod="openshift-controller-manager/controller-manager-86ddfdfb8c-rk57r" Nov 28 13:25:21 crc kubenswrapper[4970]: I1128 13:25:21.546345 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/78c697da-65a5-49c7-b38c-cc8fad4f5432-proxy-ca-bundles\") pod \"controller-manager-86ddfdfb8c-rk57r\" (UID: \"78c697da-65a5-49c7-b38c-cc8fad4f5432\") " pod="openshift-controller-manager/controller-manager-86ddfdfb8c-rk57r" Nov 28 13:25:21 crc kubenswrapper[4970]: I1128 13:25:21.546394 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/78c697da-65a5-49c7-b38c-cc8fad4f5432-serving-cert\") pod \"controller-manager-86ddfdfb8c-rk57r\" (UID: \"78c697da-65a5-49c7-b38c-cc8fad4f5432\") " pod="openshift-controller-manager/controller-manager-86ddfdfb8c-rk57r" Nov 28 13:25:21 crc kubenswrapper[4970]: I1128 13:25:21.546501 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/78c697da-65a5-49c7-b38c-cc8fad4f5432-client-ca\") pod \"controller-manager-86ddfdfb8c-rk57r\" (UID: \"78c697da-65a5-49c7-b38c-cc8fad4f5432\") " pod="openshift-controller-manager/controller-manager-86ddfdfb8c-rk57r" Nov 28 13:25:21 crc kubenswrapper[4970]: I1128 13:25:21.546643 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/78c697da-65a5-49c7-b38c-cc8fad4f5432-config\") pod \"controller-manager-86ddfdfb8c-rk57r\" (UID: \"78c697da-65a5-49c7-b38c-cc8fad4f5432\") " pod="openshift-controller-manager/controller-manager-86ddfdfb8c-rk57r" Nov 28 13:25:21 crc kubenswrapper[4970]: I1128 13:25:21.546810 4970 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a4c98945-289e-4bad-ae74-d0a4feada930-client-ca\") on node \"crc\" DevicePath \"\"" Nov 28 13:25:21 crc kubenswrapper[4970]: I1128 13:25:21.546836 4970 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a4c98945-289e-4bad-ae74-d0a4feada930-config\") on node \"crc\" DevicePath \"\"" Nov 28 13:25:21 crc kubenswrapper[4970]: I1128 13:25:21.546854 4970 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a4c98945-289e-4bad-ae74-d0a4feada930-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 28 13:25:21 crc kubenswrapper[4970]: I1128 13:25:21.546874 4970 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/a4c98945-289e-4bad-ae74-d0a4feada930-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Nov 28 13:25:21 crc kubenswrapper[4970]: I1128 13:25:21.546893 4970 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6csst\" (UniqueName: \"kubernetes.io/projected/a4c98945-289e-4bad-ae74-d0a4feada930-kube-api-access-6csst\") on node \"crc\" DevicePath \"\"" Nov 28 13:25:21 crc 
kubenswrapper[4970]: I1128 13:25:21.547887 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/78c697da-65a5-49c7-b38c-cc8fad4f5432-proxy-ca-bundles\") pod \"controller-manager-86ddfdfb8c-rk57r\" (UID: \"78c697da-65a5-49c7-b38c-cc8fad4f5432\") " pod="openshift-controller-manager/controller-manager-86ddfdfb8c-rk57r" Nov 28 13:25:21 crc kubenswrapper[4970]: I1128 13:25:21.548372 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/78c697da-65a5-49c7-b38c-cc8fad4f5432-config\") pod \"controller-manager-86ddfdfb8c-rk57r\" (UID: \"78c697da-65a5-49c7-b38c-cc8fad4f5432\") " pod="openshift-controller-manager/controller-manager-86ddfdfb8c-rk57r" Nov 28 13:25:21 crc kubenswrapper[4970]: I1128 13:25:21.548416 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/78c697da-65a5-49c7-b38c-cc8fad4f5432-client-ca\") pod \"controller-manager-86ddfdfb8c-rk57r\" (UID: \"78c697da-65a5-49c7-b38c-cc8fad4f5432\") " pod="openshift-controller-manager/controller-manager-86ddfdfb8c-rk57r" Nov 28 13:25:21 crc kubenswrapper[4970]: I1128 13:25:21.550118 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/78c697da-65a5-49c7-b38c-cc8fad4f5432-serving-cert\") pod \"controller-manager-86ddfdfb8c-rk57r\" (UID: \"78c697da-65a5-49c7-b38c-cc8fad4f5432\") " pod="openshift-controller-manager/controller-manager-86ddfdfb8c-rk57r" Nov 28 13:25:21 crc kubenswrapper[4970]: I1128 13:25:21.566936 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p4v5r\" (UniqueName: \"kubernetes.io/projected/78c697da-65a5-49c7-b38c-cc8fad4f5432-kube-api-access-p4v5r\") pod \"controller-manager-86ddfdfb8c-rk57r\" (UID: \"78c697da-65a5-49c7-b38c-cc8fad4f5432\") " pod="openshift-controller-manager/controller-manager-86ddfdfb8c-rk57r" Nov 28 13:25:21 crc kubenswrapper[4970]: I1128 13:25:21.614364 4970 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-86ddfdfb8c-rk57r" Nov 28 13:25:21 crc kubenswrapper[4970]: I1128 13:25:21.647544 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l4gnr\" (UniqueName: \"kubernetes.io/projected/4369b978-7720-433e-ba2a-ef658c76b0b2-kube-api-access-l4gnr\") pod \"4369b978-7720-433e-ba2a-ef658c76b0b2\" (UID: \"4369b978-7720-433e-ba2a-ef658c76b0b2\") " Nov 28 13:25:21 crc kubenswrapper[4970]: I1128 13:25:21.647619 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4369b978-7720-433e-ba2a-ef658c76b0b2-config\") pod \"4369b978-7720-433e-ba2a-ef658c76b0b2\" (UID: \"4369b978-7720-433e-ba2a-ef658c76b0b2\") " Nov 28 13:25:21 crc kubenswrapper[4970]: I1128 13:25:21.647652 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4369b978-7720-433e-ba2a-ef658c76b0b2-client-ca\") pod \"4369b978-7720-433e-ba2a-ef658c76b0b2\" (UID: \"4369b978-7720-433e-ba2a-ef658c76b0b2\") " Nov 28 13:25:21 crc kubenswrapper[4970]: I1128 13:25:21.647718 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4369b978-7720-433e-ba2a-ef658c76b0b2-serving-cert\") pod \"4369b978-7720-433e-ba2a-ef658c76b0b2\" (UID: \"4369b978-7720-433e-ba2a-ef658c76b0b2\") " Nov 28 13:25:21 crc kubenswrapper[4970]: I1128 13:25:21.648333 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4369b978-7720-433e-ba2a-ef658c76b0b2-client-ca" (OuterVolumeSpecName: "client-ca") pod "4369b978-7720-433e-ba2a-ef658c76b0b2" (UID: "4369b978-7720-433e-ba2a-ef658c76b0b2"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 28 13:25:21 crc kubenswrapper[4970]: I1128 13:25:21.648628 4970 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4369b978-7720-433e-ba2a-ef658c76b0b2-client-ca\") on node \"crc\" DevicePath \"\"" Nov 28 13:25:21 crc kubenswrapper[4970]: I1128 13:25:21.648721 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4369b978-7720-433e-ba2a-ef658c76b0b2-config" (OuterVolumeSpecName: "config") pod "4369b978-7720-433e-ba2a-ef658c76b0b2" (UID: "4369b978-7720-433e-ba2a-ef658c76b0b2"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 28 13:25:21 crc kubenswrapper[4970]: I1128 13:25:21.650686 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4369b978-7720-433e-ba2a-ef658c76b0b2-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "4369b978-7720-433e-ba2a-ef658c76b0b2" (UID: "4369b978-7720-433e-ba2a-ef658c76b0b2"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 28 13:25:21 crc kubenswrapper[4970]: I1128 13:25:21.650763 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4369b978-7720-433e-ba2a-ef658c76b0b2-kube-api-access-l4gnr" (OuterVolumeSpecName: "kube-api-access-l4gnr") pod "4369b978-7720-433e-ba2a-ef658c76b0b2" (UID: "4369b978-7720-433e-ba2a-ef658c76b0b2"). InnerVolumeSpecName "kube-api-access-l4gnr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 28 13:25:21 crc kubenswrapper[4970]: I1128 13:25:21.750483 4970 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l4gnr\" (UniqueName: \"kubernetes.io/projected/4369b978-7720-433e-ba2a-ef658c76b0b2-kube-api-access-l4gnr\") on node \"crc\" DevicePath \"\"" Nov 28 13:25:21 crc kubenswrapper[4970]: I1128 13:25:21.750524 4970 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4369b978-7720-433e-ba2a-ef658c76b0b2-config\") on node \"crc\" DevicePath \"\"" Nov 28 13:25:21 crc kubenswrapper[4970]: I1128 13:25:21.750540 4970 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4369b978-7720-433e-ba2a-ef658c76b0b2-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 28 13:25:21 crc kubenswrapper[4970]: I1128 13:25:21.816532 4970 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-86ddfdfb8c-rk57r"] Nov 28 13:25:21 crc kubenswrapper[4970]: W1128 13:25:21.827549 4970 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod78c697da_65a5_49c7_b38c_cc8fad4f5432.slice/crio-a3307b455c04c953e0161703916706af7494be801a37c17242a25c25f951fc2c WatchSource:0}: Error finding container a3307b455c04c953e0161703916706af7494be801a37c17242a25c25f951fc2c: Status 404 returned error can't find the container with id a3307b455c04c953e0161703916706af7494be801a37c17242a25c25f951fc2c Nov 28 13:25:21 crc kubenswrapper[4970]: I1128 13:25:21.844514 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-86ddfdfb8c-rk57r" event={"ID":"78c697da-65a5-49c7-b38c-cc8fad4f5432","Type":"ContainerStarted","Data":"a3307b455c04c953e0161703916706af7494be801a37c17242a25c25f951fc2c"} Nov 28 13:25:21 crc kubenswrapper[4970]: I1128 13:25:21.845898 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-657fdc8645-zvsmm" event={"ID":"4369b978-7720-433e-ba2a-ef658c76b0b2","Type":"ContainerDied","Data":"884865bd2c0eae9795b57ff7486cc8912912d5b68651ef4a07c3a4b24f3acc08"} Nov 28 13:25:21 crc kubenswrapper[4970]: I1128 13:25:21.846202 4970 scope.go:117] "RemoveContainer" containerID="5dc9690a7db96736a9c9fd1b5870947b048d3403f7810882625e37bb5e25d7ff" Nov 28 13:25:21 crc kubenswrapper[4970]: I1128 13:25:21.846577 4970 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-657fdc8645-zvsmm" Nov 28 13:25:21 crc kubenswrapper[4970]: I1128 13:25:21.847107 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-574c848897-sqzn9" event={"ID":"a4c98945-289e-4bad-ae74-d0a4feada930","Type":"ContainerDied","Data":"0530b5c7ef19f10c6ffc26ea83f9cba094dadaf2c95cb8d7044ce5b0172a4054"} Nov 28 13:25:21 crc kubenswrapper[4970]: I1128 13:25:21.847168 4970 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-574c848897-sqzn9" Nov 28 13:25:21 crc kubenswrapper[4970]: I1128 13:25:21.863153 4970 scope.go:117] "RemoveContainer" containerID="d1641923deb8c34d166781b8fd3e9cdd0c4ec1f0439a97161948a9568ee5082d" Nov 28 13:25:21 crc kubenswrapper[4970]: I1128 13:25:21.898324 4970 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-574c848897-sqzn9"] Nov 28 13:25:21 crc kubenswrapper[4970]: I1128 13:25:21.903806 4970 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-574c848897-sqzn9"] Nov 28 13:25:21 crc kubenswrapper[4970]: I1128 13:25:21.907178 4970 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-657fdc8645-zvsmm"] Nov 28 13:25:21 crc kubenswrapper[4970]: I1128 13:25:21.911028 4970 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-657fdc8645-zvsmm"] Nov 28 13:25:22 crc kubenswrapper[4970]: I1128 13:25:22.853311 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-86ddfdfb8c-rk57r" event={"ID":"78c697da-65a5-49c7-b38c-cc8fad4f5432","Type":"ContainerStarted","Data":"5a08a2421a2cc78941ea052ea1f74adc4116b8abf6aeb6e93f4fe4b64c3215cc"} Nov 28 13:25:22 crc kubenswrapper[4970]: I1128 13:25:22.854486 4970 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-86ddfdfb8c-rk57r" Nov 28 13:25:22 crc kubenswrapper[4970]: I1128 13:25:22.859590 4970 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-86ddfdfb8c-rk57r" Nov 28 13:25:22 crc kubenswrapper[4970]: I1128 13:25:22.884363 4970 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-86ddfdfb8c-rk57r" podStartSLOduration=6.884345803 podStartE2EDuration="6.884345803s" podCreationTimestamp="2025-11-28 13:25:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-28 13:25:22.882923953 +0000 UTC m=+333.735805793" watchObservedRunningTime="2025-11-28 13:25:22.884345803 +0000 UTC m=+333.737227603" Nov 28 13:25:23 crc kubenswrapper[4970]: I1128 13:25:23.393726 4970 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4369b978-7720-433e-ba2a-ef658c76b0b2" path="/var/lib/kubelet/pods/4369b978-7720-433e-ba2a-ef658c76b0b2/volumes" Nov 28 13:25:23 crc kubenswrapper[4970]: I1128 13:25:23.395592 4970 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a4c98945-289e-4bad-ae74-d0a4feada930" path="/var/lib/kubelet/pods/a4c98945-289e-4bad-ae74-d0a4feada930/volumes" Nov 28 13:25:23 crc kubenswrapper[4970]: I1128 13:25:23.419662 4970 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6d47b8966f-jfkvw"] Nov 28 13:25:23 crc kubenswrapper[4970]: E1128 13:25:23.420047 4970 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4369b978-7720-433e-ba2a-ef658c76b0b2" containerName="route-controller-manager" Nov 28 13:25:23 crc kubenswrapper[4970]: I1128 13:25:23.420082 4970 state_mem.go:107] "Deleted CPUSet assignment" podUID="4369b978-7720-433e-ba2a-ef658c76b0b2" containerName="route-controller-manager" Nov 28 13:25:23 crc 
kubenswrapper[4970]: I1128 13:25:23.420333 4970 memory_manager.go:354] "RemoveStaleState removing state" podUID="4369b978-7720-433e-ba2a-ef658c76b0b2" containerName="route-controller-manager" Nov 28 13:25:23 crc kubenswrapper[4970]: I1128 13:25:23.421074 4970 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6d47b8966f-jfkvw" Nov 28 13:25:23 crc kubenswrapper[4970]: I1128 13:25:23.425308 4970 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Nov 28 13:25:23 crc kubenswrapper[4970]: I1128 13:25:23.427014 4970 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Nov 28 13:25:23 crc kubenswrapper[4970]: I1128 13:25:23.427397 4970 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Nov 28 13:25:23 crc kubenswrapper[4970]: I1128 13:25:23.427399 4970 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Nov 28 13:25:23 crc kubenswrapper[4970]: I1128 13:25:23.427671 4970 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Nov 28 13:25:23 crc kubenswrapper[4970]: I1128 13:25:23.427711 4970 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Nov 28 13:25:23 crc kubenswrapper[4970]: I1128 13:25:23.430872 4970 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6d47b8966f-jfkvw"] Nov 28 13:25:23 crc kubenswrapper[4970]: I1128 13:25:23.480023 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c0db42e0-566c-42d3-a107-745ca58a5b1b-serving-cert\") pod \"route-controller-manager-6d47b8966f-jfkvw\" (UID: \"c0db42e0-566c-42d3-a107-745ca58a5b1b\") " pod="openshift-route-controller-manager/route-controller-manager-6d47b8966f-jfkvw" Nov 28 13:25:23 crc kubenswrapper[4970]: I1128 13:25:23.480478 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c0db42e0-566c-42d3-a107-745ca58a5b1b-config\") pod \"route-controller-manager-6d47b8966f-jfkvw\" (UID: \"c0db42e0-566c-42d3-a107-745ca58a5b1b\") " pod="openshift-route-controller-manager/route-controller-manager-6d47b8966f-jfkvw" Nov 28 13:25:23 crc kubenswrapper[4970]: I1128 13:25:23.480645 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ddn8q\" (UniqueName: \"kubernetes.io/projected/c0db42e0-566c-42d3-a107-745ca58a5b1b-kube-api-access-ddn8q\") pod \"route-controller-manager-6d47b8966f-jfkvw\" (UID: \"c0db42e0-566c-42d3-a107-745ca58a5b1b\") " pod="openshift-route-controller-manager/route-controller-manager-6d47b8966f-jfkvw" Nov 28 13:25:23 crc kubenswrapper[4970]: I1128 13:25:23.480722 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c0db42e0-566c-42d3-a107-745ca58a5b1b-client-ca\") pod \"route-controller-manager-6d47b8966f-jfkvw\" (UID: \"c0db42e0-566c-42d3-a107-745ca58a5b1b\") " 
pod="openshift-route-controller-manager/route-controller-manager-6d47b8966f-jfkvw" Nov 28 13:25:23 crc kubenswrapper[4970]: I1128 13:25:23.582156 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c0db42e0-566c-42d3-a107-745ca58a5b1b-client-ca\") pod \"route-controller-manager-6d47b8966f-jfkvw\" (UID: \"c0db42e0-566c-42d3-a107-745ca58a5b1b\") " pod="openshift-route-controller-manager/route-controller-manager-6d47b8966f-jfkvw" Nov 28 13:25:23 crc kubenswrapper[4970]: I1128 13:25:23.582566 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c0db42e0-566c-42d3-a107-745ca58a5b1b-serving-cert\") pod \"route-controller-manager-6d47b8966f-jfkvw\" (UID: \"c0db42e0-566c-42d3-a107-745ca58a5b1b\") " pod="openshift-route-controller-manager/route-controller-manager-6d47b8966f-jfkvw" Nov 28 13:25:23 crc kubenswrapper[4970]: I1128 13:25:23.582661 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c0db42e0-566c-42d3-a107-745ca58a5b1b-config\") pod \"route-controller-manager-6d47b8966f-jfkvw\" (UID: \"c0db42e0-566c-42d3-a107-745ca58a5b1b\") " pod="openshift-route-controller-manager/route-controller-manager-6d47b8966f-jfkvw" Nov 28 13:25:23 crc kubenswrapper[4970]: I1128 13:25:23.582712 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ddn8q\" (UniqueName: \"kubernetes.io/projected/c0db42e0-566c-42d3-a107-745ca58a5b1b-kube-api-access-ddn8q\") pod \"route-controller-manager-6d47b8966f-jfkvw\" (UID: \"c0db42e0-566c-42d3-a107-745ca58a5b1b\") " pod="openshift-route-controller-manager/route-controller-manager-6d47b8966f-jfkvw" Nov 28 13:25:23 crc kubenswrapper[4970]: I1128 13:25:23.584402 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c0db42e0-566c-42d3-a107-745ca58a5b1b-config\") pod \"route-controller-manager-6d47b8966f-jfkvw\" (UID: \"c0db42e0-566c-42d3-a107-745ca58a5b1b\") " pod="openshift-route-controller-manager/route-controller-manager-6d47b8966f-jfkvw" Nov 28 13:25:23 crc kubenswrapper[4970]: I1128 13:25:23.584587 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c0db42e0-566c-42d3-a107-745ca58a5b1b-client-ca\") pod \"route-controller-manager-6d47b8966f-jfkvw\" (UID: \"c0db42e0-566c-42d3-a107-745ca58a5b1b\") " pod="openshift-route-controller-manager/route-controller-manager-6d47b8966f-jfkvw" Nov 28 13:25:23 crc kubenswrapper[4970]: I1128 13:25:23.595483 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c0db42e0-566c-42d3-a107-745ca58a5b1b-serving-cert\") pod \"route-controller-manager-6d47b8966f-jfkvw\" (UID: \"c0db42e0-566c-42d3-a107-745ca58a5b1b\") " pod="openshift-route-controller-manager/route-controller-manager-6d47b8966f-jfkvw" Nov 28 13:25:23 crc kubenswrapper[4970]: I1128 13:25:23.605393 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ddn8q\" (UniqueName: \"kubernetes.io/projected/c0db42e0-566c-42d3-a107-745ca58a5b1b-kube-api-access-ddn8q\") pod \"route-controller-manager-6d47b8966f-jfkvw\" (UID: \"c0db42e0-566c-42d3-a107-745ca58a5b1b\") " pod="openshift-route-controller-manager/route-controller-manager-6d47b8966f-jfkvw" Nov 28 13:25:23 crc 
kubenswrapper[4970]: I1128 13:25:23.754353 4970 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6d47b8966f-jfkvw" Nov 28 13:25:23 crc kubenswrapper[4970]: I1128 13:25:23.782612 4970 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-z7zkq" Nov 28 13:25:23 crc kubenswrapper[4970]: I1128 13:25:23.847204 4970 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-z7zkq" Nov 28 13:25:24 crc kubenswrapper[4970]: I1128 13:25:24.171798 4970 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6d47b8966f-jfkvw"] Nov 28 13:25:24 crc kubenswrapper[4970]: W1128 13:25:24.174317 4970 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc0db42e0_566c_42d3_a107_745ca58a5b1b.slice/crio-5395e1da8f83bd4ef6c86f785ed7b8c6865f68bea22166c8c022af8cf0392c30 WatchSource:0}: Error finding container 5395e1da8f83bd4ef6c86f785ed7b8c6865f68bea22166c8c022af8cf0392c30: Status 404 returned error can't find the container with id 5395e1da8f83bd4ef6c86f785ed7b8c6865f68bea22166c8c022af8cf0392c30 Nov 28 13:25:24 crc kubenswrapper[4970]: I1128 13:25:24.876615 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6d47b8966f-jfkvw" event={"ID":"c0db42e0-566c-42d3-a107-745ca58a5b1b","Type":"ContainerStarted","Data":"940c14f905315a5f426ae495233dfbe1c7cfd2d554986578c156ee1878493cd0"} Nov 28 13:25:24 crc kubenswrapper[4970]: I1128 13:25:24.877021 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6d47b8966f-jfkvw" event={"ID":"c0db42e0-566c-42d3-a107-745ca58a5b1b","Type":"ContainerStarted","Data":"5395e1da8f83bd4ef6c86f785ed7b8c6865f68bea22166c8c022af8cf0392c30"} Nov 28 13:25:24 crc kubenswrapper[4970]: I1128 13:25:24.897004 4970 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6d47b8966f-jfkvw" podStartSLOduration=8.896983823 podStartE2EDuration="8.896983823s" podCreationTimestamp="2025-11-28 13:25:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-28 13:25:24.895840101 +0000 UTC m=+335.748721901" watchObservedRunningTime="2025-11-28 13:25:24.896983823 +0000 UTC m=+335.749865623" Nov 28 13:25:25 crc kubenswrapper[4970]: I1128 13:25:25.883109 4970 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6d47b8966f-jfkvw" Nov 28 13:25:25 crc kubenswrapper[4970]: I1128 13:25:25.891855 4970 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6d47b8966f-jfkvw" Nov 28 13:25:41 crc kubenswrapper[4970]: I1128 13:25:41.854719 4970 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-image-registry/image-registry-697d97f7c8-tjj4p" podUID="309db78c-54d0-452b-8b62-979217816260" containerName="registry" containerID="cri-o://58260184f068b7112b9d37999434bdfd071637de831c2bf10e1367c0412d414a" gracePeriod=30 Nov 28 13:25:42 crc kubenswrapper[4970]: I1128 13:25:42.895324 4970 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-tjj4p" Nov 28 13:25:43 crc kubenswrapper[4970]: I1128 13:25:43.026901 4970 generic.go:334] "Generic (PLEG): container finished" podID="309db78c-54d0-452b-8b62-979217816260" containerID="58260184f068b7112b9d37999434bdfd071637de831c2bf10e1367c0412d414a" exitCode=0 Nov 28 13:25:43 crc kubenswrapper[4970]: I1128 13:25:43.026938 4970 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-tjj4p" Nov 28 13:25:43 crc kubenswrapper[4970]: I1128 13:25:43.026964 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-tjj4p" event={"ID":"309db78c-54d0-452b-8b62-979217816260","Type":"ContainerDied","Data":"58260184f068b7112b9d37999434bdfd071637de831c2bf10e1367c0412d414a"} Nov 28 13:25:43 crc kubenswrapper[4970]: I1128 13:25:43.028149 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-tjj4p" event={"ID":"309db78c-54d0-452b-8b62-979217816260","Type":"ContainerDied","Data":"de9d7f5e240df14e268e2a1b1bfab21dc4f1e2f9cfe1b777dba65eb4a8825fd0"} Nov 28 13:25:43 crc kubenswrapper[4970]: I1128 13:25:43.028203 4970 scope.go:117] "RemoveContainer" containerID="58260184f068b7112b9d37999434bdfd071637de831c2bf10e1367c0412d414a" Nov 28 13:25:43 crc kubenswrapper[4970]: I1128 13:25:43.056330 4970 scope.go:117] "RemoveContainer" containerID="58260184f068b7112b9d37999434bdfd071637de831c2bf10e1367c0412d414a" Nov 28 13:25:43 crc kubenswrapper[4970]: E1128 13:25:43.056903 4970 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"58260184f068b7112b9d37999434bdfd071637de831c2bf10e1367c0412d414a\": container with ID starting with 58260184f068b7112b9d37999434bdfd071637de831c2bf10e1367c0412d414a not found: ID does not exist" containerID="58260184f068b7112b9d37999434bdfd071637de831c2bf10e1367c0412d414a" Nov 28 13:25:43 crc kubenswrapper[4970]: I1128 13:25:43.056953 4970 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"58260184f068b7112b9d37999434bdfd071637de831c2bf10e1367c0412d414a"} err="failed to get container status \"58260184f068b7112b9d37999434bdfd071637de831c2bf10e1367c0412d414a\": rpc error: code = NotFound desc = could not find container \"58260184f068b7112b9d37999434bdfd071637de831c2bf10e1367c0412d414a\": container with ID starting with 58260184f068b7112b9d37999434bdfd071637de831c2bf10e1367c0412d414a not found: ID does not exist" Nov 28 13:25:43 crc kubenswrapper[4970]: I1128 13:25:43.059946 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/309db78c-54d0-452b-8b62-979217816260-registry-tls\") pod \"309db78c-54d0-452b-8b62-979217816260\" (UID: \"309db78c-54d0-452b-8b62-979217816260\") " Nov 28 13:25:43 crc kubenswrapper[4970]: I1128 13:25:43.060384 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-storage\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"309db78c-54d0-452b-8b62-979217816260\" (UID: \"309db78c-54d0-452b-8b62-979217816260\") " Nov 28 13:25:43 crc kubenswrapper[4970]: I1128 13:25:43.060447 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/309db78c-54d0-452b-8b62-979217816260-trusted-ca\") pod \"309db78c-54d0-452b-8b62-979217816260\" (UID: \"309db78c-54d0-452b-8b62-979217816260\") " Nov 28 13:25:43 crc kubenswrapper[4970]: I1128 13:25:43.060498 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/309db78c-54d0-452b-8b62-979217816260-installation-pull-secrets\") pod \"309db78c-54d0-452b-8b62-979217816260\" (UID: \"309db78c-54d0-452b-8b62-979217816260\") " Nov 28 13:25:43 crc kubenswrapper[4970]: I1128 13:25:43.060587 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/309db78c-54d0-452b-8b62-979217816260-ca-trust-extracted\") pod \"309db78c-54d0-452b-8b62-979217816260\" (UID: \"309db78c-54d0-452b-8b62-979217816260\") " Nov 28 13:25:43 crc kubenswrapper[4970]: I1128 13:25:43.060625 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/309db78c-54d0-452b-8b62-979217816260-bound-sa-token\") pod \"309db78c-54d0-452b-8b62-979217816260\" (UID: \"309db78c-54d0-452b-8b62-979217816260\") " Nov 28 13:25:43 crc kubenswrapper[4970]: I1128 13:25:43.060720 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k6sv2\" (UniqueName: \"kubernetes.io/projected/309db78c-54d0-452b-8b62-979217816260-kube-api-access-k6sv2\") pod \"309db78c-54d0-452b-8b62-979217816260\" (UID: \"309db78c-54d0-452b-8b62-979217816260\") " Nov 28 13:25:43 crc kubenswrapper[4970]: I1128 13:25:43.060758 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/309db78c-54d0-452b-8b62-979217816260-registry-certificates\") pod \"309db78c-54d0-452b-8b62-979217816260\" (UID: \"309db78c-54d0-452b-8b62-979217816260\") " Nov 28 13:25:43 crc kubenswrapper[4970]: I1128 13:25:43.061712 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/309db78c-54d0-452b-8b62-979217816260-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "309db78c-54d0-452b-8b62-979217816260" (UID: "309db78c-54d0-452b-8b62-979217816260"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 28 13:25:43 crc kubenswrapper[4970]: I1128 13:25:43.061900 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/309db78c-54d0-452b-8b62-979217816260-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "309db78c-54d0-452b-8b62-979217816260" (UID: "309db78c-54d0-452b-8b62-979217816260"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 28 13:25:43 crc kubenswrapper[4970]: I1128 13:25:43.068927 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/309db78c-54d0-452b-8b62-979217816260-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "309db78c-54d0-452b-8b62-979217816260" (UID: "309db78c-54d0-452b-8b62-979217816260"). InnerVolumeSpecName "registry-tls". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 28 13:25:43 crc kubenswrapper[4970]: I1128 13:25:43.069668 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/309db78c-54d0-452b-8b62-979217816260-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "309db78c-54d0-452b-8b62-979217816260" (UID: "309db78c-54d0-452b-8b62-979217816260"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 28 13:25:43 crc kubenswrapper[4970]: I1128 13:25:43.069871 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/309db78c-54d0-452b-8b62-979217816260-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "309db78c-54d0-452b-8b62-979217816260" (UID: "309db78c-54d0-452b-8b62-979217816260"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 28 13:25:43 crc kubenswrapper[4970]: I1128 13:25:43.070109 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/309db78c-54d0-452b-8b62-979217816260-kube-api-access-k6sv2" (OuterVolumeSpecName: "kube-api-access-k6sv2") pod "309db78c-54d0-452b-8b62-979217816260" (UID: "309db78c-54d0-452b-8b62-979217816260"). InnerVolumeSpecName "kube-api-access-k6sv2". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 28 13:25:43 crc kubenswrapper[4970]: I1128 13:25:43.075350 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "registry-storage") pod "309db78c-54d0-452b-8b62-979217816260" (UID: "309db78c-54d0-452b-8b62-979217816260"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Nov 28 13:25:43 crc kubenswrapper[4970]: I1128 13:25:43.087872 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/309db78c-54d0-452b-8b62-979217816260-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "309db78c-54d0-452b-8b62-979217816260" (UID: "309db78c-54d0-452b-8b62-979217816260"). InnerVolumeSpecName "ca-trust-extracted". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 28 13:25:43 crc kubenswrapper[4970]: I1128 13:25:43.162259 4970 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/309db78c-54d0-452b-8b62-979217816260-registry-tls\") on node \"crc\" DevicePath \"\"" Nov 28 13:25:43 crc kubenswrapper[4970]: I1128 13:25:43.162321 4970 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/309db78c-54d0-452b-8b62-979217816260-trusted-ca\") on node \"crc\" DevicePath \"\"" Nov 28 13:25:43 crc kubenswrapper[4970]: I1128 13:25:43.162372 4970 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/309db78c-54d0-452b-8b62-979217816260-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Nov 28 13:25:43 crc kubenswrapper[4970]: I1128 13:25:43.162393 4970 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/309db78c-54d0-452b-8b62-979217816260-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Nov 28 13:25:43 crc kubenswrapper[4970]: I1128 13:25:43.162411 4970 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/309db78c-54d0-452b-8b62-979217816260-bound-sa-token\") on node \"crc\" DevicePath \"\"" Nov 28 13:25:43 crc kubenswrapper[4970]: I1128 13:25:43.162430 4970 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k6sv2\" (UniqueName: \"kubernetes.io/projected/309db78c-54d0-452b-8b62-979217816260-kube-api-access-k6sv2\") on node \"crc\" DevicePath \"\"" Nov 28 13:25:43 crc kubenswrapper[4970]: I1128 13:25:43.162446 4970 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/309db78c-54d0-452b-8b62-979217816260-registry-certificates\") on node \"crc\" DevicePath \"\"" Nov 28 13:25:43 crc kubenswrapper[4970]: I1128 13:25:43.362325 4970 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-tjj4p"] Nov 28 13:25:43 crc kubenswrapper[4970]: I1128 13:25:43.368306 4970 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-tjj4p"] Nov 28 13:25:43 crc kubenswrapper[4970]: I1128 13:25:43.388607 4970 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="309db78c-54d0-452b-8b62-979217816260" path="/var/lib/kubelet/pods/309db78c-54d0-452b-8b62-979217816260/volumes" Nov 28 13:25:51 crc kubenswrapper[4970]: I1128 13:25:51.333600 4970 patch_prober.go:28] interesting pod/machine-config-daemon-tjrng container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 28 13:25:51 crc kubenswrapper[4970]: I1128 13:25:51.334073 4970 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-tjrng" podUID="70bedd43-c527-436e-b47b-0b9ec5b10601" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 28 13:26:21 crc kubenswrapper[4970]: I1128 13:26:21.334172 4970 patch_prober.go:28] interesting pod/machine-config-daemon-tjrng container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness 
probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 28 13:26:21 crc kubenswrapper[4970]: I1128 13:26:21.334990 4970 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-tjrng" podUID="70bedd43-c527-436e-b47b-0b9ec5b10601" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 28 13:26:51 crc kubenswrapper[4970]: I1128 13:26:51.333058 4970 patch_prober.go:28] interesting pod/machine-config-daemon-tjrng container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 28 13:26:51 crc kubenswrapper[4970]: I1128 13:26:51.333661 4970 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-tjrng" podUID="70bedd43-c527-436e-b47b-0b9ec5b10601" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 28 13:26:51 crc kubenswrapper[4970]: I1128 13:26:51.333713 4970 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-tjrng" Nov 28 13:26:51 crc kubenswrapper[4970]: I1128 13:26:51.334229 4970 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"37f88618e3e0c64d996d73cc9caf51cfb18f91db50c0f6d5a80a21593f745369"} pod="openshift-machine-config-operator/machine-config-daemon-tjrng" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Nov 28 13:26:51 crc kubenswrapper[4970]: I1128 13:26:51.334320 4970 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-tjrng" podUID="70bedd43-c527-436e-b47b-0b9ec5b10601" containerName="machine-config-daemon" containerID="cri-o://37f88618e3e0c64d996d73cc9caf51cfb18f91db50c0f6d5a80a21593f745369" gracePeriod=600 Nov 28 13:26:51 crc kubenswrapper[4970]: I1128 13:26:51.472984 4970 generic.go:334] "Generic (PLEG): container finished" podID="70bedd43-c527-436e-b47b-0b9ec5b10601" containerID="37f88618e3e0c64d996d73cc9caf51cfb18f91db50c0f6d5a80a21593f745369" exitCode=0 Nov 28 13:26:51 crc kubenswrapper[4970]: I1128 13:26:51.473049 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-tjrng" event={"ID":"70bedd43-c527-436e-b47b-0b9ec5b10601","Type":"ContainerDied","Data":"37f88618e3e0c64d996d73cc9caf51cfb18f91db50c0f6d5a80a21593f745369"} Nov 28 13:26:51 crc kubenswrapper[4970]: I1128 13:26:51.473096 4970 scope.go:117] "RemoveContainer" containerID="fc9b6fc184f5dc3ba36a264ad6b3b87d8306222016e8b9eab63d75530062a2bd" Nov 28 13:26:52 crc kubenswrapper[4970]: I1128 13:26:52.480008 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-tjrng" event={"ID":"70bedd43-c527-436e-b47b-0b9ec5b10601","Type":"ContainerStarted","Data":"1c9fff9237be9f51a58f9e7060799a552c40dc5fcc7d7c71f57ed50492cd23cf"} Nov 28 13:28:51 crc kubenswrapper[4970]: I1128 13:28:51.333942 4970 patch_prober.go:28] interesting pod/machine-config-daemon-tjrng 
container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 28 13:28:51 crc kubenswrapper[4970]: I1128 13:28:51.334765 4970 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-tjrng" podUID="70bedd43-c527-436e-b47b-0b9ec5b10601" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 28 13:29:21 crc kubenswrapper[4970]: I1128 13:29:21.333541 4970 patch_prober.go:28] interesting pod/machine-config-daemon-tjrng container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 28 13:29:21 crc kubenswrapper[4970]: I1128 13:29:21.334378 4970 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-tjrng" podUID="70bedd43-c527-436e-b47b-0b9ec5b10601" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 28 13:29:51 crc kubenswrapper[4970]: I1128 13:29:51.333838 4970 patch_prober.go:28] interesting pod/machine-config-daemon-tjrng container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 28 13:29:51 crc kubenswrapper[4970]: I1128 13:29:51.334530 4970 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-tjrng" podUID="70bedd43-c527-436e-b47b-0b9ec5b10601" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 28 13:29:51 crc kubenswrapper[4970]: I1128 13:29:51.334594 4970 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-tjrng" Nov 28 13:29:51 crc kubenswrapper[4970]: I1128 13:29:51.335559 4970 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"1c9fff9237be9f51a58f9e7060799a552c40dc5fcc7d7c71f57ed50492cd23cf"} pod="openshift-machine-config-operator/machine-config-daemon-tjrng" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Nov 28 13:29:51 crc kubenswrapper[4970]: I1128 13:29:51.335645 4970 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-tjrng" podUID="70bedd43-c527-436e-b47b-0b9ec5b10601" containerName="machine-config-daemon" containerID="cri-o://1c9fff9237be9f51a58f9e7060799a552c40dc5fcc7d7c71f57ed50492cd23cf" gracePeriod=600 Nov 28 13:29:51 crc kubenswrapper[4970]: I1128 13:29:51.653561 4970 generic.go:334] "Generic (PLEG): container finished" podID="70bedd43-c527-436e-b47b-0b9ec5b10601" containerID="1c9fff9237be9f51a58f9e7060799a552c40dc5fcc7d7c71f57ed50492cd23cf" exitCode=0 Nov 28 13:29:51 crc kubenswrapper[4970]: I1128 13:29:51.653616 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-tjrng" event={"ID":"70bedd43-c527-436e-b47b-0b9ec5b10601","Type":"ContainerDied","Data":"1c9fff9237be9f51a58f9e7060799a552c40dc5fcc7d7c71f57ed50492cd23cf"} Nov 28 13:29:51 crc kubenswrapper[4970]: I1128 13:29:51.653658 4970 scope.go:117] "RemoveContainer" containerID="37f88618e3e0c64d996d73cc9caf51cfb18f91db50c0f6d5a80a21593f745369" Nov 28 13:29:52 crc kubenswrapper[4970]: I1128 13:29:52.662018 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-tjrng" event={"ID":"70bedd43-c527-436e-b47b-0b9ec5b10601","Type":"ContainerStarted","Data":"86a03fe6c83c6ac3411e98ed1337717f0b27b46f31a13d39550e07889da6badd"} Nov 28 13:30:00 crc kubenswrapper[4970]: I1128 13:30:00.209487 4970 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29405610-dm8ll"] Nov 28 13:30:00 crc kubenswrapper[4970]: E1128 13:30:00.210383 4970 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="309db78c-54d0-452b-8b62-979217816260" containerName="registry" Nov 28 13:30:00 crc kubenswrapper[4970]: I1128 13:30:00.210400 4970 state_mem.go:107] "Deleted CPUSet assignment" podUID="309db78c-54d0-452b-8b62-979217816260" containerName="registry" Nov 28 13:30:00 crc kubenswrapper[4970]: I1128 13:30:00.210601 4970 memory_manager.go:354] "RemoveStaleState removing state" podUID="309db78c-54d0-452b-8b62-979217816260" containerName="registry" Nov 28 13:30:00 crc kubenswrapper[4970]: I1128 13:30:00.211149 4970 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29405610-dm8ll" Nov 28 13:30:00 crc kubenswrapper[4970]: I1128 13:30:00.214367 4970 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Nov 28 13:30:00 crc kubenswrapper[4970]: I1128 13:30:00.214910 4970 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Nov 28 13:30:00 crc kubenswrapper[4970]: I1128 13:30:00.222458 4970 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29405610-dm8ll"] Nov 28 13:30:00 crc kubenswrapper[4970]: I1128 13:30:00.317302 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c029aad0-6fb8-4bee-8e82-7dd60cad7054-config-volume\") pod \"collect-profiles-29405610-dm8ll\" (UID: \"c029aad0-6fb8-4bee-8e82-7dd60cad7054\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29405610-dm8ll" Nov 28 13:30:00 crc kubenswrapper[4970]: I1128 13:30:00.317446 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qjgrw\" (UniqueName: \"kubernetes.io/projected/c029aad0-6fb8-4bee-8e82-7dd60cad7054-kube-api-access-qjgrw\") pod \"collect-profiles-29405610-dm8ll\" (UID: \"c029aad0-6fb8-4bee-8e82-7dd60cad7054\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29405610-dm8ll" Nov 28 13:30:00 crc kubenswrapper[4970]: I1128 13:30:00.317494 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c029aad0-6fb8-4bee-8e82-7dd60cad7054-secret-volume\") pod \"collect-profiles-29405610-dm8ll\" (UID: 
\"c029aad0-6fb8-4bee-8e82-7dd60cad7054\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29405610-dm8ll" Nov 28 13:30:00 crc kubenswrapper[4970]: I1128 13:30:00.419192 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qjgrw\" (UniqueName: \"kubernetes.io/projected/c029aad0-6fb8-4bee-8e82-7dd60cad7054-kube-api-access-qjgrw\") pod \"collect-profiles-29405610-dm8ll\" (UID: \"c029aad0-6fb8-4bee-8e82-7dd60cad7054\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29405610-dm8ll" Nov 28 13:30:00 crc kubenswrapper[4970]: I1128 13:30:00.419304 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c029aad0-6fb8-4bee-8e82-7dd60cad7054-secret-volume\") pod \"collect-profiles-29405610-dm8ll\" (UID: \"c029aad0-6fb8-4bee-8e82-7dd60cad7054\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29405610-dm8ll" Nov 28 13:30:00 crc kubenswrapper[4970]: I1128 13:30:00.419394 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c029aad0-6fb8-4bee-8e82-7dd60cad7054-config-volume\") pod \"collect-profiles-29405610-dm8ll\" (UID: \"c029aad0-6fb8-4bee-8e82-7dd60cad7054\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29405610-dm8ll" Nov 28 13:30:00 crc kubenswrapper[4970]: I1128 13:30:00.420830 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c029aad0-6fb8-4bee-8e82-7dd60cad7054-config-volume\") pod \"collect-profiles-29405610-dm8ll\" (UID: \"c029aad0-6fb8-4bee-8e82-7dd60cad7054\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29405610-dm8ll" Nov 28 13:30:00 crc kubenswrapper[4970]: I1128 13:30:00.429100 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c029aad0-6fb8-4bee-8e82-7dd60cad7054-secret-volume\") pod \"collect-profiles-29405610-dm8ll\" (UID: \"c029aad0-6fb8-4bee-8e82-7dd60cad7054\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29405610-dm8ll" Nov 28 13:30:00 crc kubenswrapper[4970]: I1128 13:30:00.453335 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qjgrw\" (UniqueName: \"kubernetes.io/projected/c029aad0-6fb8-4bee-8e82-7dd60cad7054-kube-api-access-qjgrw\") pod \"collect-profiles-29405610-dm8ll\" (UID: \"c029aad0-6fb8-4bee-8e82-7dd60cad7054\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29405610-dm8ll" Nov 28 13:30:00 crc kubenswrapper[4970]: I1128 13:30:00.540585 4970 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29405610-dm8ll" Nov 28 13:30:00 crc kubenswrapper[4970]: I1128 13:30:00.776121 4970 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29405610-dm8ll"] Nov 28 13:30:01 crc kubenswrapper[4970]: I1128 13:30:01.711852 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29405610-dm8ll" event={"ID":"c029aad0-6fb8-4bee-8e82-7dd60cad7054","Type":"ContainerStarted","Data":"a15884519ce909f42fabe87cc5769b058087289cec7934b037df7997b4d54e87"} Nov 28 13:30:01 crc kubenswrapper[4970]: I1128 13:30:01.712324 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29405610-dm8ll" event={"ID":"c029aad0-6fb8-4bee-8e82-7dd60cad7054","Type":"ContainerStarted","Data":"60be44f9027416a3af0fcf6e7ff368f2aa15d68c926290954077b70992916c7f"} Nov 28 13:30:01 crc kubenswrapper[4970]: I1128 13:30:01.731358 4970 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29405610-dm8ll" podStartSLOduration=1.731337635 podStartE2EDuration="1.731337635s" podCreationTimestamp="2025-11-28 13:30:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-28 13:30:01.72976928 +0000 UTC m=+612.582651100" watchObservedRunningTime="2025-11-28 13:30:01.731337635 +0000 UTC m=+612.584219435" Nov 28 13:30:02 crc kubenswrapper[4970]: I1128 13:30:02.718667 4970 generic.go:334] "Generic (PLEG): container finished" podID="c029aad0-6fb8-4bee-8e82-7dd60cad7054" containerID="a15884519ce909f42fabe87cc5769b058087289cec7934b037df7997b4d54e87" exitCode=0 Nov 28 13:30:02 crc kubenswrapper[4970]: I1128 13:30:02.718730 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29405610-dm8ll" event={"ID":"c029aad0-6fb8-4bee-8e82-7dd60cad7054","Type":"ContainerDied","Data":"a15884519ce909f42fabe87cc5769b058087289cec7934b037df7997b4d54e87"} Nov 28 13:30:04 crc kubenswrapper[4970]: I1128 13:30:04.007903 4970 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29405610-dm8ll" Nov 28 13:30:04 crc kubenswrapper[4970]: I1128 13:30:04.169112 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qjgrw\" (UniqueName: \"kubernetes.io/projected/c029aad0-6fb8-4bee-8e82-7dd60cad7054-kube-api-access-qjgrw\") pod \"c029aad0-6fb8-4bee-8e82-7dd60cad7054\" (UID: \"c029aad0-6fb8-4bee-8e82-7dd60cad7054\") " Nov 28 13:30:04 crc kubenswrapper[4970]: I1128 13:30:04.169387 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c029aad0-6fb8-4bee-8e82-7dd60cad7054-secret-volume\") pod \"c029aad0-6fb8-4bee-8e82-7dd60cad7054\" (UID: \"c029aad0-6fb8-4bee-8e82-7dd60cad7054\") " Nov 28 13:30:04 crc kubenswrapper[4970]: I1128 13:30:04.169467 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c029aad0-6fb8-4bee-8e82-7dd60cad7054-config-volume\") pod \"c029aad0-6fb8-4bee-8e82-7dd60cad7054\" (UID: \"c029aad0-6fb8-4bee-8e82-7dd60cad7054\") " Nov 28 13:30:04 crc kubenswrapper[4970]: I1128 13:30:04.170731 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c029aad0-6fb8-4bee-8e82-7dd60cad7054-config-volume" (OuterVolumeSpecName: "config-volume") pod "c029aad0-6fb8-4bee-8e82-7dd60cad7054" (UID: "c029aad0-6fb8-4bee-8e82-7dd60cad7054"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 28 13:30:04 crc kubenswrapper[4970]: I1128 13:30:04.175404 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c029aad0-6fb8-4bee-8e82-7dd60cad7054-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "c029aad0-6fb8-4bee-8e82-7dd60cad7054" (UID: "c029aad0-6fb8-4bee-8e82-7dd60cad7054"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 28 13:30:04 crc kubenswrapper[4970]: I1128 13:30:04.176783 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c029aad0-6fb8-4bee-8e82-7dd60cad7054-kube-api-access-qjgrw" (OuterVolumeSpecName: "kube-api-access-qjgrw") pod "c029aad0-6fb8-4bee-8e82-7dd60cad7054" (UID: "c029aad0-6fb8-4bee-8e82-7dd60cad7054"). InnerVolumeSpecName "kube-api-access-qjgrw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 28 13:30:04 crc kubenswrapper[4970]: I1128 13:30:04.270893 4970 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c029aad0-6fb8-4bee-8e82-7dd60cad7054-secret-volume\") on node \"crc\" DevicePath \"\"" Nov 28 13:30:04 crc kubenswrapper[4970]: I1128 13:30:04.270932 4970 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c029aad0-6fb8-4bee-8e82-7dd60cad7054-config-volume\") on node \"crc\" DevicePath \"\"" Nov 28 13:30:04 crc kubenswrapper[4970]: I1128 13:30:04.270948 4970 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qjgrw\" (UniqueName: \"kubernetes.io/projected/c029aad0-6fb8-4bee-8e82-7dd60cad7054-kube-api-access-qjgrw\") on node \"crc\" DevicePath \"\"" Nov 28 13:30:04 crc kubenswrapper[4970]: I1128 13:30:04.738135 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29405610-dm8ll" event={"ID":"c029aad0-6fb8-4bee-8e82-7dd60cad7054","Type":"ContainerDied","Data":"60be44f9027416a3af0fcf6e7ff368f2aa15d68c926290954077b70992916c7f"} Nov 28 13:30:04 crc kubenswrapper[4970]: I1128 13:30:04.738176 4970 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="60be44f9027416a3af0fcf6e7ff368f2aa15d68c926290954077b70992916c7f" Nov 28 13:30:04 crc kubenswrapper[4970]: I1128 13:30:04.738181 4970 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29405610-dm8ll" Nov 28 13:30:41 crc kubenswrapper[4970]: I1128 13:30:41.904274 4970 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Nov 28 13:30:42 crc kubenswrapper[4970]: I1128 13:30:42.107558 4970 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-6c6s9"] Nov 28 13:30:42 crc kubenswrapper[4970]: I1128 13:30:42.107899 4970 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-6c6s9" podUID="17474edc-f114-4ee6-b6bb-95b55f1731ac" containerName="ovn-controller" containerID="cri-o://f3080d04f3ff55e9ea3e5226bc08fed0e4937b1ebbab93b76c039066071aba2d" gracePeriod=30 Nov 28 13:30:42 crc kubenswrapper[4970]: I1128 13:30:42.108017 4970 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-6c6s9" podUID="17474edc-f114-4ee6-b6bb-95b55f1731ac" containerName="kube-rbac-proxy-node" containerID="cri-o://a7d93a763c00244c10708497ff36b83939050bf928342bd4ead223a08fda036d" gracePeriod=30 Nov 28 13:30:42 crc kubenswrapper[4970]: I1128 13:30:42.107998 4970 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-6c6s9" podUID="17474edc-f114-4ee6-b6bb-95b55f1731ac" containerName="northd" containerID="cri-o://189ed85c89ee8535a047fcd0640a468aa0ea5ce2f787ec1e5c83b10b2c4636f8" gracePeriod=30 Nov 28 13:30:42 crc kubenswrapper[4970]: I1128 13:30:42.108056 4970 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-6c6s9" podUID="17474edc-f114-4ee6-b6bb-95b55f1731ac" containerName="ovn-acl-logging" containerID="cri-o://3dbb30bfa38ca50eb6b55645d501b246d1f9a16d6d4c6b40402ba6aac34f8e50" gracePeriod=30 Nov 28 13:30:42 crc kubenswrapper[4970]: I1128 13:30:42.107957 4970 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-6c6s9" podUID="17474edc-f114-4ee6-b6bb-95b55f1731ac" containerName="kube-rbac-proxy-ovn-metrics" containerID="cri-o://12d0179126488b68ea59ae5ac2c1144408426c93d8ee99a546fe23c0e78bc29d" gracePeriod=30 Nov 28 13:30:42 crc kubenswrapper[4970]: I1128 13:30:42.110310 4970 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-6c6s9" podUID="17474edc-f114-4ee6-b6bb-95b55f1731ac" containerName="sbdb" containerID="cri-o://cdd0fc0e7dc5e274448e9288c6b82d7b14c900064fbf885342824db7ffa53248" gracePeriod=30 Nov 28 13:30:42 crc kubenswrapper[4970]: I1128 13:30:42.108170 4970 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-6c6s9" podUID="17474edc-f114-4ee6-b6bb-95b55f1731ac" containerName="nbdb" containerID="cri-o://50bbc6bccc812634f10fce491d51f0bfc11b450bdd9435d97b5723bc25815737" gracePeriod=30 Nov 28 13:30:42 crc kubenswrapper[4970]: I1128 13:30:42.147498 4970 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-6c6s9" podUID="17474edc-f114-4ee6-b6bb-95b55f1731ac" containerName="ovnkube-controller" containerID="cri-o://1f1a97d1705ed1e76b0cf0f089eb6c47193e9abd9fb034819db37ed7fa020ecf" gracePeriod=30 Nov 28 13:30:42 crc kubenswrapper[4970]: I1128 13:30:42.803165 4970 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-6c6s9_17474edc-f114-4ee6-b6bb-95b55f1731ac/ovn-acl-logging/0.log" Nov 28 13:30:42 crc kubenswrapper[4970]: I1128 13:30:42.803961 4970 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-6c6s9_17474edc-f114-4ee6-b6bb-95b55f1731ac/ovn-controller/0.log" Nov 28 13:30:42 crc kubenswrapper[4970]: I1128 13:30:42.804380 4970 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-6c6s9" Nov 28 13:30:42 crc kubenswrapper[4970]: I1128 13:30:42.868443 4970 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-9mrsq"] Nov 28 13:30:42 crc kubenswrapper[4970]: E1128 13:30:42.868653 4970 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="17474edc-f114-4ee6-b6bb-95b55f1731ac" containerName="kubecfg-setup" Nov 28 13:30:42 crc kubenswrapper[4970]: I1128 13:30:42.868666 4970 state_mem.go:107] "Deleted CPUSet assignment" podUID="17474edc-f114-4ee6-b6bb-95b55f1731ac" containerName="kubecfg-setup" Nov 28 13:30:42 crc kubenswrapper[4970]: E1128 13:30:42.868691 4970 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="17474edc-f114-4ee6-b6bb-95b55f1731ac" containerName="northd" Nov 28 13:30:42 crc kubenswrapper[4970]: I1128 13:30:42.868698 4970 state_mem.go:107] "Deleted CPUSet assignment" podUID="17474edc-f114-4ee6-b6bb-95b55f1731ac" containerName="northd" Nov 28 13:30:42 crc kubenswrapper[4970]: E1128 13:30:42.868707 4970 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="17474edc-f114-4ee6-b6bb-95b55f1731ac" containerName="sbdb" Nov 28 13:30:42 crc kubenswrapper[4970]: I1128 13:30:42.868715 4970 state_mem.go:107] "Deleted CPUSet assignment" podUID="17474edc-f114-4ee6-b6bb-95b55f1731ac" containerName="sbdb" Nov 28 13:30:42 crc kubenswrapper[4970]: E1128 13:30:42.868721 4970 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="17474edc-f114-4ee6-b6bb-95b55f1731ac" containerName="ovnkube-controller" Nov 28 13:30:42 crc kubenswrapper[4970]: I1128 13:30:42.868728 4970 state_mem.go:107] "Deleted CPUSet assignment" podUID="17474edc-f114-4ee6-b6bb-95b55f1731ac" containerName="ovnkube-controller" Nov 28 13:30:42 crc kubenswrapper[4970]: E1128 13:30:42.868738 4970 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="17474edc-f114-4ee6-b6bb-95b55f1731ac" containerName="nbdb" Nov 28 13:30:42 crc kubenswrapper[4970]: I1128 13:30:42.868745 4970 state_mem.go:107] "Deleted CPUSet assignment" podUID="17474edc-f114-4ee6-b6bb-95b55f1731ac" containerName="nbdb" Nov 28 13:30:42 crc kubenswrapper[4970]: E1128 13:30:42.868757 4970 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c029aad0-6fb8-4bee-8e82-7dd60cad7054" containerName="collect-profiles" Nov 28 13:30:42 crc kubenswrapper[4970]: I1128 13:30:42.868765 4970 state_mem.go:107] "Deleted CPUSet assignment" podUID="c029aad0-6fb8-4bee-8e82-7dd60cad7054" containerName="collect-profiles" Nov 28 13:30:42 crc kubenswrapper[4970]: E1128 13:30:42.868777 4970 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="17474edc-f114-4ee6-b6bb-95b55f1731ac" containerName="kube-rbac-proxy-ovn-metrics" Nov 28 13:30:42 crc kubenswrapper[4970]: I1128 13:30:42.868783 4970 state_mem.go:107] "Deleted CPUSet assignment" podUID="17474edc-f114-4ee6-b6bb-95b55f1731ac" containerName="kube-rbac-proxy-ovn-metrics" Nov 28 13:30:42 crc kubenswrapper[4970]: E1128 13:30:42.868792 4970 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="17474edc-f114-4ee6-b6bb-95b55f1731ac" containerName="kube-rbac-proxy-node" Nov 28 13:30:42 crc kubenswrapper[4970]: I1128 13:30:42.868799 4970 state_mem.go:107] "Deleted CPUSet assignment" podUID="17474edc-f114-4ee6-b6bb-95b55f1731ac" containerName="kube-rbac-proxy-node" Nov 28 13:30:42 crc kubenswrapper[4970]: E1128 13:30:42.868808 4970 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="17474edc-f114-4ee6-b6bb-95b55f1731ac" containerName="ovn-controller" Nov 28 13:30:42 crc kubenswrapper[4970]: I1128 13:30:42.868815 4970 state_mem.go:107] "Deleted CPUSet assignment" podUID="17474edc-f114-4ee6-b6bb-95b55f1731ac" containerName="ovn-controller" Nov 28 13:30:42 crc kubenswrapper[4970]: E1128 13:30:42.868830 4970 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="17474edc-f114-4ee6-b6bb-95b55f1731ac" containerName="ovn-acl-logging" Nov 28 13:30:42 crc kubenswrapper[4970]: I1128 13:30:42.868837 4970 state_mem.go:107] "Deleted CPUSet assignment" podUID="17474edc-f114-4ee6-b6bb-95b55f1731ac" containerName="ovn-acl-logging" Nov 28 13:30:42 crc kubenswrapper[4970]: I1128 13:30:42.868962 4970 memory_manager.go:354] "RemoveStaleState removing state" podUID="17474edc-f114-4ee6-b6bb-95b55f1731ac" containerName="ovn-controller" Nov 28 13:30:42 crc kubenswrapper[4970]: I1128 13:30:42.868974 4970 memory_manager.go:354] "RemoveStaleState removing state" podUID="17474edc-f114-4ee6-b6bb-95b55f1731ac" containerName="ovnkube-controller" Nov 28 13:30:42 crc kubenswrapper[4970]: I1128 13:30:42.868983 4970 memory_manager.go:354] "RemoveStaleState removing state" podUID="17474edc-f114-4ee6-b6bb-95b55f1731ac" containerName="kube-rbac-proxy-ovn-metrics" Nov 28 13:30:42 crc kubenswrapper[4970]: I1128 13:30:42.868995 4970 memory_manager.go:354] "RemoveStaleState removing state" podUID="c029aad0-6fb8-4bee-8e82-7dd60cad7054" containerName="collect-profiles" Nov 28 13:30:42 crc kubenswrapper[4970]: I1128 13:30:42.869002 4970 memory_manager.go:354] "RemoveStaleState removing state" podUID="17474edc-f114-4ee6-b6bb-95b55f1731ac" containerName="nbdb" Nov 28 13:30:42 crc kubenswrapper[4970]: I1128 13:30:42.869012 4970 memory_manager.go:354] "RemoveStaleState removing state" podUID="17474edc-f114-4ee6-b6bb-95b55f1731ac" containerName="northd" Nov 28 13:30:42 crc kubenswrapper[4970]: I1128 13:30:42.869021 4970 memory_manager.go:354] "RemoveStaleState removing state" podUID="17474edc-f114-4ee6-b6bb-95b55f1731ac" containerName="kube-rbac-proxy-node" Nov 28 13:30:42 crc kubenswrapper[4970]: I1128 13:30:42.869032 4970 memory_manager.go:354] "RemoveStaleState removing state" podUID="17474edc-f114-4ee6-b6bb-95b55f1731ac" containerName="sbdb" Nov 28 13:30:42 crc kubenswrapper[4970]: I1128 13:30:42.869042 4970 memory_manager.go:354] "RemoveStaleState removing state" podUID="17474edc-f114-4ee6-b6bb-95b55f1731ac" containerName="ovn-acl-logging" Nov 28 13:30:42 crc kubenswrapper[4970]: I1128 13:30:42.870922 4970 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-9mrsq" Nov 28 13:30:42 crc kubenswrapper[4970]: I1128 13:30:42.886554 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/17474edc-f114-4ee6-b6bb-95b55f1731ac-run-ovn\") pod \"17474edc-f114-4ee6-b6bb-95b55f1731ac\" (UID: \"17474edc-f114-4ee6-b6bb-95b55f1731ac\") " Nov 28 13:30:42 crc kubenswrapper[4970]: I1128 13:30:42.886597 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/17474edc-f114-4ee6-b6bb-95b55f1731ac-node-log\") pod \"17474edc-f114-4ee6-b6bb-95b55f1731ac\" (UID: \"17474edc-f114-4ee6-b6bb-95b55f1731ac\") " Nov 28 13:30:42 crc kubenswrapper[4970]: I1128 13:30:42.886615 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/17474edc-f114-4ee6-b6bb-95b55f1731ac-host-cni-bin\") pod \"17474edc-f114-4ee6-b6bb-95b55f1731ac\" (UID: \"17474edc-f114-4ee6-b6bb-95b55f1731ac\") " Nov 28 13:30:42 crc kubenswrapper[4970]: I1128 13:30:42.886644 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/17474edc-f114-4ee6-b6bb-95b55f1731ac-run-ovn" (OuterVolumeSpecName: "run-ovn") pod "17474edc-f114-4ee6-b6bb-95b55f1731ac" (UID: "17474edc-f114-4ee6-b6bb-95b55f1731ac"). InnerVolumeSpecName "run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 28 13:30:42 crc kubenswrapper[4970]: I1128 13:30:42.886703 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/17474edc-f114-4ee6-b6bb-95b55f1731ac-node-log" (OuterVolumeSpecName: "node-log") pod "17474edc-f114-4ee6-b6bb-95b55f1731ac" (UID: "17474edc-f114-4ee6-b6bb-95b55f1731ac"). InnerVolumeSpecName "node-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 28 13:30:42 crc kubenswrapper[4970]: I1128 13:30:42.886727 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/17474edc-f114-4ee6-b6bb-95b55f1731ac-host-cni-bin" (OuterVolumeSpecName: "host-cni-bin") pod "17474edc-f114-4ee6-b6bb-95b55f1731ac" (UID: "17474edc-f114-4ee6-b6bb-95b55f1731ac"). InnerVolumeSpecName "host-cni-bin". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 28 13:30:42 crc kubenswrapper[4970]: I1128 13:30:42.886832 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/f28c398a-ea7e-4117-b7f9-fa78579062e1-ovn-node-metrics-cert\") pod \"ovnkube-node-9mrsq\" (UID: \"f28c398a-ea7e-4117-b7f9-fa78579062e1\") " pod="openshift-ovn-kubernetes/ovnkube-node-9mrsq" Nov 28 13:30:42 crc kubenswrapper[4970]: I1128 13:30:42.886900 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f28c398a-ea7e-4117-b7f9-fa78579062e1-run-openvswitch\") pod \"ovnkube-node-9mrsq\" (UID: \"f28c398a-ea7e-4117-b7f9-fa78579062e1\") " pod="openshift-ovn-kubernetes/ovnkube-node-9mrsq" Nov 28 13:30:42 crc kubenswrapper[4970]: I1128 13:30:42.886947 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/f28c398a-ea7e-4117-b7f9-fa78579062e1-node-log\") pod \"ovnkube-node-9mrsq\" (UID: \"f28c398a-ea7e-4117-b7f9-fa78579062e1\") " pod="openshift-ovn-kubernetes/ovnkube-node-9mrsq" Nov 28 13:30:42 crc kubenswrapper[4970]: I1128 13:30:42.886982 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/f28c398a-ea7e-4117-b7f9-fa78579062e1-run-ovn\") pod \"ovnkube-node-9mrsq\" (UID: \"f28c398a-ea7e-4117-b7f9-fa78579062e1\") " pod="openshift-ovn-kubernetes/ovnkube-node-9mrsq" Nov 28 13:30:42 crc kubenswrapper[4970]: I1128 13:30:42.887011 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f28c398a-ea7e-4117-b7f9-fa78579062e1-host-run-ovn-kubernetes\") pod \"ovnkube-node-9mrsq\" (UID: \"f28c398a-ea7e-4117-b7f9-fa78579062e1\") " pod="openshift-ovn-kubernetes/ovnkube-node-9mrsq" Nov 28 13:30:42 crc kubenswrapper[4970]: I1128 13:30:42.887103 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/f28c398a-ea7e-4117-b7f9-fa78579062e1-host-cni-bin\") pod \"ovnkube-node-9mrsq\" (UID: \"f28c398a-ea7e-4117-b7f9-fa78579062e1\") " pod="openshift-ovn-kubernetes/ovnkube-node-9mrsq" Nov 28 13:30:42 crc kubenswrapper[4970]: I1128 13:30:42.887140 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/f28c398a-ea7e-4117-b7f9-fa78579062e1-env-overrides\") pod \"ovnkube-node-9mrsq\" (UID: \"f28c398a-ea7e-4117-b7f9-fa78579062e1\") " pod="openshift-ovn-kubernetes/ovnkube-node-9mrsq" Nov 28 13:30:42 crc kubenswrapper[4970]: I1128 13:30:42.887174 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/f28c398a-ea7e-4117-b7f9-fa78579062e1-host-cni-netd\") pod \"ovnkube-node-9mrsq\" (UID: \"f28c398a-ea7e-4117-b7f9-fa78579062e1\") " pod="openshift-ovn-kubernetes/ovnkube-node-9mrsq" Nov 28 13:30:42 crc kubenswrapper[4970]: I1128 13:30:42.887304 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/f28c398a-ea7e-4117-b7f9-fa78579062e1-systemd-units\") pod 
\"ovnkube-node-9mrsq\" (UID: \"f28c398a-ea7e-4117-b7f9-fa78579062e1\") " pod="openshift-ovn-kubernetes/ovnkube-node-9mrsq" Nov 28 13:30:42 crc kubenswrapper[4970]: I1128 13:30:42.887356 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/f28c398a-ea7e-4117-b7f9-fa78579062e1-host-run-netns\") pod \"ovnkube-node-9mrsq\" (UID: \"f28c398a-ea7e-4117-b7f9-fa78579062e1\") " pod="openshift-ovn-kubernetes/ovnkube-node-9mrsq" Nov 28 13:30:42 crc kubenswrapper[4970]: I1128 13:30:42.887401 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f28c398a-ea7e-4117-b7f9-fa78579062e1-var-lib-openvswitch\") pod \"ovnkube-node-9mrsq\" (UID: \"f28c398a-ea7e-4117-b7f9-fa78579062e1\") " pod="openshift-ovn-kubernetes/ovnkube-node-9mrsq" Nov 28 13:30:42 crc kubenswrapper[4970]: I1128 13:30:42.887429 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f28c398a-ea7e-4117-b7f9-fa78579062e1-etc-openvswitch\") pod \"ovnkube-node-9mrsq\" (UID: \"f28c398a-ea7e-4117-b7f9-fa78579062e1\") " pod="openshift-ovn-kubernetes/ovnkube-node-9mrsq" Nov 28 13:30:42 crc kubenswrapper[4970]: I1128 13:30:42.887445 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/f28c398a-ea7e-4117-b7f9-fa78579062e1-host-kubelet\") pod \"ovnkube-node-9mrsq\" (UID: \"f28c398a-ea7e-4117-b7f9-fa78579062e1\") " pod="openshift-ovn-kubernetes/ovnkube-node-9mrsq" Nov 28 13:30:42 crc kubenswrapper[4970]: I1128 13:30:42.887466 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/f28c398a-ea7e-4117-b7f9-fa78579062e1-ovnkube-config\") pod \"ovnkube-node-9mrsq\" (UID: \"f28c398a-ea7e-4117-b7f9-fa78579062e1\") " pod="openshift-ovn-kubernetes/ovnkube-node-9mrsq" Nov 28 13:30:42 crc kubenswrapper[4970]: I1128 13:30:42.887513 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/f28c398a-ea7e-4117-b7f9-fa78579062e1-ovnkube-script-lib\") pod \"ovnkube-node-9mrsq\" (UID: \"f28c398a-ea7e-4117-b7f9-fa78579062e1\") " pod="openshift-ovn-kubernetes/ovnkube-node-9mrsq" Nov 28 13:30:42 crc kubenswrapper[4970]: I1128 13:30:42.887537 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z8gcn\" (UniqueName: \"kubernetes.io/projected/f28c398a-ea7e-4117-b7f9-fa78579062e1-kube-api-access-z8gcn\") pod \"ovnkube-node-9mrsq\" (UID: \"f28c398a-ea7e-4117-b7f9-fa78579062e1\") " pod="openshift-ovn-kubernetes/ovnkube-node-9mrsq" Nov 28 13:30:42 crc kubenswrapper[4970]: I1128 13:30:42.887561 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/f28c398a-ea7e-4117-b7f9-fa78579062e1-log-socket\") pod \"ovnkube-node-9mrsq\" (UID: \"f28c398a-ea7e-4117-b7f9-fa78579062e1\") " pod="openshift-ovn-kubernetes/ovnkube-node-9mrsq" Nov 28 13:30:42 crc kubenswrapper[4970]: I1128 13:30:42.887588 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" 
(UniqueName: \"kubernetes.io/host-path/f28c398a-ea7e-4117-b7f9-fa78579062e1-run-systemd\") pod \"ovnkube-node-9mrsq\" (UID: \"f28c398a-ea7e-4117-b7f9-fa78579062e1\") " pod="openshift-ovn-kubernetes/ovnkube-node-9mrsq" Nov 28 13:30:42 crc kubenswrapper[4970]: I1128 13:30:42.887626 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/f28c398a-ea7e-4117-b7f9-fa78579062e1-host-slash\") pod \"ovnkube-node-9mrsq\" (UID: \"f28c398a-ea7e-4117-b7f9-fa78579062e1\") " pod="openshift-ovn-kubernetes/ovnkube-node-9mrsq" Nov 28 13:30:42 crc kubenswrapper[4970]: I1128 13:30:42.887648 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f28c398a-ea7e-4117-b7f9-fa78579062e1-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-9mrsq\" (UID: \"f28c398a-ea7e-4117-b7f9-fa78579062e1\") " pod="openshift-ovn-kubernetes/ovnkube-node-9mrsq" Nov 28 13:30:42 crc kubenswrapper[4970]: I1128 13:30:42.887719 4970 reconciler_common.go:293] "Volume detached for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/17474edc-f114-4ee6-b6bb-95b55f1731ac-run-ovn\") on node \"crc\" DevicePath \"\"" Nov 28 13:30:42 crc kubenswrapper[4970]: I1128 13:30:42.887732 4970 reconciler_common.go:293] "Volume detached for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/17474edc-f114-4ee6-b6bb-95b55f1731ac-node-log\") on node \"crc\" DevicePath \"\"" Nov 28 13:30:42 crc kubenswrapper[4970]: I1128 13:30:42.887743 4970 reconciler_common.go:293] "Volume detached for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/17474edc-f114-4ee6-b6bb-95b55f1731ac-host-cni-bin\") on node \"crc\" DevicePath \"\"" Nov 28 13:30:42 crc kubenswrapper[4970]: I1128 13:30:42.960818 4970 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-6c6s9_17474edc-f114-4ee6-b6bb-95b55f1731ac/ovn-acl-logging/0.log" Nov 28 13:30:42 crc kubenswrapper[4970]: I1128 13:30:42.961268 4970 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-6c6s9_17474edc-f114-4ee6-b6bb-95b55f1731ac/ovn-controller/0.log" Nov 28 13:30:42 crc kubenswrapper[4970]: I1128 13:30:42.961624 4970 generic.go:334] "Generic (PLEG): container finished" podID="17474edc-f114-4ee6-b6bb-95b55f1731ac" containerID="1f1a97d1705ed1e76b0cf0f089eb6c47193e9abd9fb034819db37ed7fa020ecf" exitCode=0 Nov 28 13:30:42 crc kubenswrapper[4970]: I1128 13:30:42.961646 4970 generic.go:334] "Generic (PLEG): container finished" podID="17474edc-f114-4ee6-b6bb-95b55f1731ac" containerID="cdd0fc0e7dc5e274448e9288c6b82d7b14c900064fbf885342824db7ffa53248" exitCode=0 Nov 28 13:30:42 crc kubenswrapper[4970]: I1128 13:30:42.961653 4970 generic.go:334] "Generic (PLEG): container finished" podID="17474edc-f114-4ee6-b6bb-95b55f1731ac" containerID="50bbc6bccc812634f10fce491d51f0bfc11b450bdd9435d97b5723bc25815737" exitCode=0 Nov 28 13:30:42 crc kubenswrapper[4970]: I1128 13:30:42.961661 4970 generic.go:334] "Generic (PLEG): container finished" podID="17474edc-f114-4ee6-b6bb-95b55f1731ac" containerID="189ed85c89ee8535a047fcd0640a468aa0ea5ce2f787ec1e5c83b10b2c4636f8" exitCode=0 Nov 28 13:30:42 crc kubenswrapper[4970]: I1128 13:30:42.961669 4970 generic.go:334] "Generic (PLEG): container finished" podID="17474edc-f114-4ee6-b6bb-95b55f1731ac" 
containerID="12d0179126488b68ea59ae5ac2c1144408426c93d8ee99a546fe23c0e78bc29d" exitCode=0 Nov 28 13:30:42 crc kubenswrapper[4970]: I1128 13:30:42.961676 4970 generic.go:334] "Generic (PLEG): container finished" podID="17474edc-f114-4ee6-b6bb-95b55f1731ac" containerID="a7d93a763c00244c10708497ff36b83939050bf928342bd4ead223a08fda036d" exitCode=0 Nov 28 13:30:42 crc kubenswrapper[4970]: I1128 13:30:42.961682 4970 generic.go:334] "Generic (PLEG): container finished" podID="17474edc-f114-4ee6-b6bb-95b55f1731ac" containerID="3dbb30bfa38ca50eb6b55645d501b246d1f9a16d6d4c6b40402ba6aac34f8e50" exitCode=143 Nov 28 13:30:42 crc kubenswrapper[4970]: I1128 13:30:42.961689 4970 generic.go:334] "Generic (PLEG): container finished" podID="17474edc-f114-4ee6-b6bb-95b55f1731ac" containerID="f3080d04f3ff55e9ea3e5226bc08fed0e4937b1ebbab93b76c039066071aba2d" exitCode=143 Nov 28 13:30:42 crc kubenswrapper[4970]: I1128 13:30:42.961701 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6c6s9" event={"ID":"17474edc-f114-4ee6-b6bb-95b55f1731ac","Type":"ContainerDied","Data":"1f1a97d1705ed1e76b0cf0f089eb6c47193e9abd9fb034819db37ed7fa020ecf"} Nov 28 13:30:42 crc kubenswrapper[4970]: I1128 13:30:42.961729 4970 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-6c6s9" Nov 28 13:30:42 crc kubenswrapper[4970]: I1128 13:30:42.961766 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6c6s9" event={"ID":"17474edc-f114-4ee6-b6bb-95b55f1731ac","Type":"ContainerDied","Data":"cdd0fc0e7dc5e274448e9288c6b82d7b14c900064fbf885342824db7ffa53248"} Nov 28 13:30:42 crc kubenswrapper[4970]: I1128 13:30:42.961784 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6c6s9" event={"ID":"17474edc-f114-4ee6-b6bb-95b55f1731ac","Type":"ContainerDied","Data":"50bbc6bccc812634f10fce491d51f0bfc11b450bdd9435d97b5723bc25815737"} Nov 28 13:30:42 crc kubenswrapper[4970]: I1128 13:30:42.961798 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6c6s9" event={"ID":"17474edc-f114-4ee6-b6bb-95b55f1731ac","Type":"ContainerDied","Data":"189ed85c89ee8535a047fcd0640a468aa0ea5ce2f787ec1e5c83b10b2c4636f8"} Nov 28 13:30:42 crc kubenswrapper[4970]: I1128 13:30:42.961811 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6c6s9" event={"ID":"17474edc-f114-4ee6-b6bb-95b55f1731ac","Type":"ContainerDied","Data":"12d0179126488b68ea59ae5ac2c1144408426c93d8ee99a546fe23c0e78bc29d"} Nov 28 13:30:42 crc kubenswrapper[4970]: I1128 13:30:42.961824 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6c6s9" event={"ID":"17474edc-f114-4ee6-b6bb-95b55f1731ac","Type":"ContainerDied","Data":"a7d93a763c00244c10708497ff36b83939050bf928342bd4ead223a08fda036d"} Nov 28 13:30:42 crc kubenswrapper[4970]: I1128 13:30:42.961829 4970 scope.go:117] "RemoveContainer" containerID="1f1a97d1705ed1e76b0cf0f089eb6c47193e9abd9fb034819db37ed7fa020ecf" Nov 28 13:30:42 crc kubenswrapper[4970]: I1128 13:30:42.961838 4970 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"3dbb30bfa38ca50eb6b55645d501b246d1f9a16d6d4c6b40402ba6aac34f8e50"} Nov 28 13:30:42 crc kubenswrapper[4970]: I1128 13:30:42.961853 4970 pod_container_deletor.go:114] "Failed to issue the request to remove container" 
containerID={"Type":"cri-o","ID":"f3080d04f3ff55e9ea3e5226bc08fed0e4937b1ebbab93b76c039066071aba2d"} Nov 28 13:30:42 crc kubenswrapper[4970]: I1128 13:30:42.961859 4970 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"af322842df0e9f9249a0d4c5342059686d51e57a9259e9d7f99d3ab656639a63"} Nov 28 13:30:42 crc kubenswrapper[4970]: I1128 13:30:42.961870 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6c6s9" event={"ID":"17474edc-f114-4ee6-b6bb-95b55f1731ac","Type":"ContainerDied","Data":"3dbb30bfa38ca50eb6b55645d501b246d1f9a16d6d4c6b40402ba6aac34f8e50"} Nov 28 13:30:42 crc kubenswrapper[4970]: I1128 13:30:42.961880 4970 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"1f1a97d1705ed1e76b0cf0f089eb6c47193e9abd9fb034819db37ed7fa020ecf"} Nov 28 13:30:42 crc kubenswrapper[4970]: I1128 13:30:42.961889 4970 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"cdd0fc0e7dc5e274448e9288c6b82d7b14c900064fbf885342824db7ffa53248"} Nov 28 13:30:42 crc kubenswrapper[4970]: I1128 13:30:42.961895 4970 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"50bbc6bccc812634f10fce491d51f0bfc11b450bdd9435d97b5723bc25815737"} Nov 28 13:30:42 crc kubenswrapper[4970]: I1128 13:30:42.961902 4970 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"189ed85c89ee8535a047fcd0640a468aa0ea5ce2f787ec1e5c83b10b2c4636f8"} Nov 28 13:30:42 crc kubenswrapper[4970]: I1128 13:30:42.961908 4970 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"12d0179126488b68ea59ae5ac2c1144408426c93d8ee99a546fe23c0e78bc29d"} Nov 28 13:30:42 crc kubenswrapper[4970]: I1128 13:30:42.961916 4970 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"a7d93a763c00244c10708497ff36b83939050bf928342bd4ead223a08fda036d"} Nov 28 13:30:42 crc kubenswrapper[4970]: I1128 13:30:42.961921 4970 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"3dbb30bfa38ca50eb6b55645d501b246d1f9a16d6d4c6b40402ba6aac34f8e50"} Nov 28 13:30:42 crc kubenswrapper[4970]: I1128 13:30:42.961927 4970 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"f3080d04f3ff55e9ea3e5226bc08fed0e4937b1ebbab93b76c039066071aba2d"} Nov 28 13:30:42 crc kubenswrapper[4970]: I1128 13:30:42.961933 4970 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"af322842df0e9f9249a0d4c5342059686d51e57a9259e9d7f99d3ab656639a63"} Nov 28 13:30:42 crc kubenswrapper[4970]: I1128 13:30:42.961942 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6c6s9" event={"ID":"17474edc-f114-4ee6-b6bb-95b55f1731ac","Type":"ContainerDied","Data":"f3080d04f3ff55e9ea3e5226bc08fed0e4937b1ebbab93b76c039066071aba2d"} Nov 28 13:30:42 crc kubenswrapper[4970]: I1128 13:30:42.961952 4970 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"1f1a97d1705ed1e76b0cf0f089eb6c47193e9abd9fb034819db37ed7fa020ecf"} Nov 28 13:30:42 crc kubenswrapper[4970]: I1128 13:30:42.961960 4970 
pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"cdd0fc0e7dc5e274448e9288c6b82d7b14c900064fbf885342824db7ffa53248"} Nov 28 13:30:42 crc kubenswrapper[4970]: I1128 13:30:42.961967 4970 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"50bbc6bccc812634f10fce491d51f0bfc11b450bdd9435d97b5723bc25815737"} Nov 28 13:30:42 crc kubenswrapper[4970]: I1128 13:30:42.961975 4970 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"189ed85c89ee8535a047fcd0640a468aa0ea5ce2f787ec1e5c83b10b2c4636f8"} Nov 28 13:30:42 crc kubenswrapper[4970]: I1128 13:30:42.961981 4970 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"12d0179126488b68ea59ae5ac2c1144408426c93d8ee99a546fe23c0e78bc29d"} Nov 28 13:30:42 crc kubenswrapper[4970]: I1128 13:30:42.961988 4970 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"a7d93a763c00244c10708497ff36b83939050bf928342bd4ead223a08fda036d"} Nov 28 13:30:42 crc kubenswrapper[4970]: I1128 13:30:42.961995 4970 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"3dbb30bfa38ca50eb6b55645d501b246d1f9a16d6d4c6b40402ba6aac34f8e50"} Nov 28 13:30:42 crc kubenswrapper[4970]: I1128 13:30:42.962002 4970 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"f3080d04f3ff55e9ea3e5226bc08fed0e4937b1ebbab93b76c039066071aba2d"} Nov 28 13:30:42 crc kubenswrapper[4970]: I1128 13:30:42.962008 4970 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"af322842df0e9f9249a0d4c5342059686d51e57a9259e9d7f99d3ab656639a63"} Nov 28 13:30:42 crc kubenswrapper[4970]: I1128 13:30:42.962017 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6c6s9" event={"ID":"17474edc-f114-4ee6-b6bb-95b55f1731ac","Type":"ContainerDied","Data":"014e0aa873d6af808f944184cc135d7a3b56ce1c0ee28f1ffc24d53951b18e4a"} Nov 28 13:30:42 crc kubenswrapper[4970]: I1128 13:30:42.962029 4970 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"1f1a97d1705ed1e76b0cf0f089eb6c47193e9abd9fb034819db37ed7fa020ecf"} Nov 28 13:30:42 crc kubenswrapper[4970]: I1128 13:30:42.962037 4970 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"cdd0fc0e7dc5e274448e9288c6b82d7b14c900064fbf885342824db7ffa53248"} Nov 28 13:30:42 crc kubenswrapper[4970]: I1128 13:30:42.962045 4970 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"50bbc6bccc812634f10fce491d51f0bfc11b450bdd9435d97b5723bc25815737"} Nov 28 13:30:42 crc kubenswrapper[4970]: I1128 13:30:42.962052 4970 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"189ed85c89ee8535a047fcd0640a468aa0ea5ce2f787ec1e5c83b10b2c4636f8"} Nov 28 13:30:42 crc kubenswrapper[4970]: I1128 13:30:42.962059 4970 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"12d0179126488b68ea59ae5ac2c1144408426c93d8ee99a546fe23c0e78bc29d"} Nov 28 13:30:42 crc kubenswrapper[4970]: I1128 13:30:42.962066 4970 
pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"a7d93a763c00244c10708497ff36b83939050bf928342bd4ead223a08fda036d"} Nov 28 13:30:42 crc kubenswrapper[4970]: I1128 13:30:42.962072 4970 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"3dbb30bfa38ca50eb6b55645d501b246d1f9a16d6d4c6b40402ba6aac34f8e50"} Nov 28 13:30:42 crc kubenswrapper[4970]: I1128 13:30:42.962080 4970 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"f3080d04f3ff55e9ea3e5226bc08fed0e4937b1ebbab93b76c039066071aba2d"} Nov 28 13:30:42 crc kubenswrapper[4970]: I1128 13:30:42.962086 4970 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"af322842df0e9f9249a0d4c5342059686d51e57a9259e9d7f99d3ab656639a63"} Nov 28 13:30:42 crc kubenswrapper[4970]: I1128 13:30:42.963711 4970 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-krtxh_ddf11f1e-5631-4329-9db4-b75fed094c5f/kube-multus/0.log" Nov 28 13:30:42 crc kubenswrapper[4970]: I1128 13:30:42.963781 4970 generic.go:334] "Generic (PLEG): container finished" podID="ddf11f1e-5631-4329-9db4-b75fed094c5f" containerID="24a7c4d8c833fe666f119406934eac445c54366b38731e68e743d0c8fd524617" exitCode=2 Nov 28 13:30:42 crc kubenswrapper[4970]: I1128 13:30:42.963821 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-krtxh" event={"ID":"ddf11f1e-5631-4329-9db4-b75fed094c5f","Type":"ContainerDied","Data":"24a7c4d8c833fe666f119406934eac445c54366b38731e68e743d0c8fd524617"} Nov 28 13:30:42 crc kubenswrapper[4970]: I1128 13:30:42.964579 4970 scope.go:117] "RemoveContainer" containerID="24a7c4d8c833fe666f119406934eac445c54366b38731e68e743d0c8fd524617" Nov 28 13:30:42 crc kubenswrapper[4970]: I1128 13:30:42.988646 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/17474edc-f114-4ee6-b6bb-95b55f1731ac-host-run-ovn-kubernetes\") pod \"17474edc-f114-4ee6-b6bb-95b55f1731ac\" (UID: \"17474edc-f114-4ee6-b6bb-95b55f1731ac\") " Nov 28 13:30:42 crc kubenswrapper[4970]: I1128 13:30:42.988702 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/17474edc-f114-4ee6-b6bb-95b55f1731ac-run-systemd\") pod \"17474edc-f114-4ee6-b6bb-95b55f1731ac\" (UID: \"17474edc-f114-4ee6-b6bb-95b55f1731ac\") " Nov 28 13:30:42 crc kubenswrapper[4970]: I1128 13:30:42.988725 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/17474edc-f114-4ee6-b6bb-95b55f1731ac-host-cni-netd\") pod \"17474edc-f114-4ee6-b6bb-95b55f1731ac\" (UID: \"17474edc-f114-4ee6-b6bb-95b55f1731ac\") " Nov 28 13:30:42 crc kubenswrapper[4970]: I1128 13:30:42.988741 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/17474edc-f114-4ee6-b6bb-95b55f1731ac-host-kubelet\") pod \"17474edc-f114-4ee6-b6bb-95b55f1731ac\" (UID: \"17474edc-f114-4ee6-b6bb-95b55f1731ac\") " Nov 28 13:30:42 crc kubenswrapper[4970]: I1128 13:30:42.988741 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/17474edc-f114-4ee6-b6bb-95b55f1731ac-host-run-ovn-kubernetes" (OuterVolumeSpecName: 
"host-run-ovn-kubernetes") pod "17474edc-f114-4ee6-b6bb-95b55f1731ac" (UID: "17474edc-f114-4ee6-b6bb-95b55f1731ac"). InnerVolumeSpecName "host-run-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 28 13:30:42 crc kubenswrapper[4970]: I1128 13:30:42.988766 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/17474edc-f114-4ee6-b6bb-95b55f1731ac-env-overrides\") pod \"17474edc-f114-4ee6-b6bb-95b55f1731ac\" (UID: \"17474edc-f114-4ee6-b6bb-95b55f1731ac\") " Nov 28 13:30:42 crc kubenswrapper[4970]: I1128 13:30:42.988787 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/17474edc-f114-4ee6-b6bb-95b55f1731ac-host-var-lib-cni-networks-ovn-kubernetes\") pod \"17474edc-f114-4ee6-b6bb-95b55f1731ac\" (UID: \"17474edc-f114-4ee6-b6bb-95b55f1731ac\") " Nov 28 13:30:42 crc kubenswrapper[4970]: I1128 13:30:42.988806 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/17474edc-f114-4ee6-b6bb-95b55f1731ac-host-kubelet" (OuterVolumeSpecName: "host-kubelet") pod "17474edc-f114-4ee6-b6bb-95b55f1731ac" (UID: "17474edc-f114-4ee6-b6bb-95b55f1731ac"). InnerVolumeSpecName "host-kubelet". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 28 13:30:42 crc kubenswrapper[4970]: I1128 13:30:42.988815 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/17474edc-f114-4ee6-b6bb-95b55f1731ac-host-run-netns\") pod \"17474edc-f114-4ee6-b6bb-95b55f1731ac\" (UID: \"17474edc-f114-4ee6-b6bb-95b55f1731ac\") " Nov 28 13:30:42 crc kubenswrapper[4970]: I1128 13:30:42.988839 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/17474edc-f114-4ee6-b6bb-95b55f1731ac-log-socket\") pod \"17474edc-f114-4ee6-b6bb-95b55f1731ac\" (UID: \"17474edc-f114-4ee6-b6bb-95b55f1731ac\") " Nov 28 13:30:42 crc kubenswrapper[4970]: I1128 13:30:42.988841 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/17474edc-f114-4ee6-b6bb-95b55f1731ac-host-cni-netd" (OuterVolumeSpecName: "host-cni-netd") pod "17474edc-f114-4ee6-b6bb-95b55f1731ac" (UID: "17474edc-f114-4ee6-b6bb-95b55f1731ac"). InnerVolumeSpecName "host-cni-netd". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 28 13:30:42 crc kubenswrapper[4970]: I1128 13:30:42.988883 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/17474edc-f114-4ee6-b6bb-95b55f1731ac-run-openvswitch\") pod \"17474edc-f114-4ee6-b6bb-95b55f1731ac\" (UID: \"17474edc-f114-4ee6-b6bb-95b55f1731ac\") " Nov 28 13:30:42 crc kubenswrapper[4970]: I1128 13:30:42.988888 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/17474edc-f114-4ee6-b6bb-95b55f1731ac-host-run-netns" (OuterVolumeSpecName: "host-run-netns") pod "17474edc-f114-4ee6-b6bb-95b55f1731ac" (UID: "17474edc-f114-4ee6-b6bb-95b55f1731ac"). InnerVolumeSpecName "host-run-netns". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 28 13:30:42 crc kubenswrapper[4970]: I1128 13:30:42.988890 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/17474edc-f114-4ee6-b6bb-95b55f1731ac-host-var-lib-cni-networks-ovn-kubernetes" (OuterVolumeSpecName: "host-var-lib-cni-networks-ovn-kubernetes") pod "17474edc-f114-4ee6-b6bb-95b55f1731ac" (UID: "17474edc-f114-4ee6-b6bb-95b55f1731ac"). InnerVolumeSpecName "host-var-lib-cni-networks-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 28 13:30:42 crc kubenswrapper[4970]: I1128 13:30:42.988941 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/17474edc-f114-4ee6-b6bb-95b55f1731ac-run-openvswitch" (OuterVolumeSpecName: "run-openvswitch") pod "17474edc-f114-4ee6-b6bb-95b55f1731ac" (UID: "17474edc-f114-4ee6-b6bb-95b55f1731ac"). InnerVolumeSpecName "run-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 28 13:30:42 crc kubenswrapper[4970]: I1128 13:30:42.988910 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/17474edc-f114-4ee6-b6bb-95b55f1731ac-log-socket" (OuterVolumeSpecName: "log-socket") pod "17474edc-f114-4ee6-b6bb-95b55f1731ac" (UID: "17474edc-f114-4ee6-b6bb-95b55f1731ac"). InnerVolumeSpecName "log-socket". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 28 13:30:42 crc kubenswrapper[4970]: I1128 13:30:42.988949 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/17474edc-f114-4ee6-b6bb-95b55f1731ac-etc-openvswitch" (OuterVolumeSpecName: "etc-openvswitch") pod "17474edc-f114-4ee6-b6bb-95b55f1731ac" (UID: "17474edc-f114-4ee6-b6bb-95b55f1731ac"). InnerVolumeSpecName "etc-openvswitch". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 28 13:30:42 crc kubenswrapper[4970]: I1128 13:30:42.988915 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/17474edc-f114-4ee6-b6bb-95b55f1731ac-etc-openvswitch\") pod \"17474edc-f114-4ee6-b6bb-95b55f1731ac\" (UID: \"17474edc-f114-4ee6-b6bb-95b55f1731ac\") " Nov 28 13:30:42 crc kubenswrapper[4970]: I1128 13:30:42.989037 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gsjj6\" (UniqueName: \"kubernetes.io/projected/17474edc-f114-4ee6-b6bb-95b55f1731ac-kube-api-access-gsjj6\") pod \"17474edc-f114-4ee6-b6bb-95b55f1731ac\" (UID: \"17474edc-f114-4ee6-b6bb-95b55f1731ac\") " Nov 28 13:30:42 crc kubenswrapper[4970]: I1128 13:30:42.989066 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/17474edc-f114-4ee6-b6bb-95b55f1731ac-ovnkube-script-lib\") pod \"17474edc-f114-4ee6-b6bb-95b55f1731ac\" (UID: \"17474edc-f114-4ee6-b6bb-95b55f1731ac\") " Nov 28 13:30:42 crc kubenswrapper[4970]: I1128 13:30:42.989092 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/17474edc-f114-4ee6-b6bb-95b55f1731ac-host-slash\") pod \"17474edc-f114-4ee6-b6bb-95b55f1731ac\" (UID: \"17474edc-f114-4ee6-b6bb-95b55f1731ac\") " Nov 28 13:30:42 crc kubenswrapper[4970]: I1128 13:30:42.989126 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/17474edc-f114-4ee6-b6bb-95b55f1731ac-systemd-units\") pod \"17474edc-f114-4ee6-b6bb-95b55f1731ac\" (UID: \"17474edc-f114-4ee6-b6bb-95b55f1731ac\") " Nov 28 13:30:42 crc kubenswrapper[4970]: I1128 13:30:42.989152 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/17474edc-f114-4ee6-b6bb-95b55f1731ac-ovn-node-metrics-cert\") pod \"17474edc-f114-4ee6-b6bb-95b55f1731ac\" (UID: \"17474edc-f114-4ee6-b6bb-95b55f1731ac\") " Nov 28 13:30:42 crc kubenswrapper[4970]: I1128 13:30:42.989232 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/17474edc-f114-4ee6-b6bb-95b55f1731ac-var-lib-openvswitch\") pod \"17474edc-f114-4ee6-b6bb-95b55f1731ac\" (UID: \"17474edc-f114-4ee6-b6bb-95b55f1731ac\") " Nov 28 13:30:42 crc kubenswrapper[4970]: I1128 13:30:42.989254 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/17474edc-f114-4ee6-b6bb-95b55f1731ac-ovnkube-config\") pod \"17474edc-f114-4ee6-b6bb-95b55f1731ac\" (UID: \"17474edc-f114-4ee6-b6bb-95b55f1731ac\") " Nov 28 13:30:42 crc kubenswrapper[4970]: I1128 13:30:42.989305 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/17474edc-f114-4ee6-b6bb-95b55f1731ac-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "17474edc-f114-4ee6-b6bb-95b55f1731ac" (UID: "17474edc-f114-4ee6-b6bb-95b55f1731ac"). InnerVolumeSpecName "env-overrides". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 28 13:30:42 crc kubenswrapper[4970]: I1128 13:30:42.989345 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/17474edc-f114-4ee6-b6bb-95b55f1731ac-systemd-units" (OuterVolumeSpecName: "systemd-units") pod "17474edc-f114-4ee6-b6bb-95b55f1731ac" (UID: "17474edc-f114-4ee6-b6bb-95b55f1731ac"). InnerVolumeSpecName "systemd-units". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 28 13:30:42 crc kubenswrapper[4970]: I1128 13:30:42.989582 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/17474edc-f114-4ee6-b6bb-95b55f1731ac-host-slash" (OuterVolumeSpecName: "host-slash") pod "17474edc-f114-4ee6-b6bb-95b55f1731ac" (UID: "17474edc-f114-4ee6-b6bb-95b55f1731ac"). InnerVolumeSpecName "host-slash". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 28 13:30:42 crc kubenswrapper[4970]: I1128 13:30:42.989638 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/17474edc-f114-4ee6-b6bb-95b55f1731ac-var-lib-openvswitch" (OuterVolumeSpecName: "var-lib-openvswitch") pod "17474edc-f114-4ee6-b6bb-95b55f1731ac" (UID: "17474edc-f114-4ee6-b6bb-95b55f1731ac"). InnerVolumeSpecName "var-lib-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 28 13:30:42 crc kubenswrapper[4970]: I1128 13:30:42.989707 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/f28c398a-ea7e-4117-b7f9-fa78579062e1-run-systemd\") pod \"ovnkube-node-9mrsq\" (UID: \"f28c398a-ea7e-4117-b7f9-fa78579062e1\") " pod="openshift-ovn-kubernetes/ovnkube-node-9mrsq" Nov 28 13:30:42 crc kubenswrapper[4970]: I1128 13:30:42.989749 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/17474edc-f114-4ee6-b6bb-95b55f1731ac-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "17474edc-f114-4ee6-b6bb-95b55f1731ac" (UID: "17474edc-f114-4ee6-b6bb-95b55f1731ac"). InnerVolumeSpecName "ovnkube-script-lib". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 28 13:30:42 crc kubenswrapper[4970]: I1128 13:30:42.989778 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/f28c398a-ea7e-4117-b7f9-fa78579062e1-host-slash\") pod \"ovnkube-node-9mrsq\" (UID: \"f28c398a-ea7e-4117-b7f9-fa78579062e1\") " pod="openshift-ovn-kubernetes/ovnkube-node-9mrsq" Nov 28 13:30:42 crc kubenswrapper[4970]: I1128 13:30:42.989810 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/f28c398a-ea7e-4117-b7f9-fa78579062e1-run-systemd\") pod \"ovnkube-node-9mrsq\" (UID: \"f28c398a-ea7e-4117-b7f9-fa78579062e1\") " pod="openshift-ovn-kubernetes/ovnkube-node-9mrsq" Nov 28 13:30:42 crc kubenswrapper[4970]: I1128 13:30:42.989811 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f28c398a-ea7e-4117-b7f9-fa78579062e1-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-9mrsq\" (UID: \"f28c398a-ea7e-4117-b7f9-fa78579062e1\") " pod="openshift-ovn-kubernetes/ovnkube-node-9mrsq" Nov 28 13:30:42 crc kubenswrapper[4970]: I1128 13:30:42.989850 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f28c398a-ea7e-4117-b7f9-fa78579062e1-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-9mrsq\" (UID: \"f28c398a-ea7e-4117-b7f9-fa78579062e1\") " pod="openshift-ovn-kubernetes/ovnkube-node-9mrsq" Nov 28 13:30:42 crc kubenswrapper[4970]: I1128 13:30:42.989884 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/f28c398a-ea7e-4117-b7f9-fa78579062e1-host-slash\") pod \"ovnkube-node-9mrsq\" (UID: \"f28c398a-ea7e-4117-b7f9-fa78579062e1\") " pod="openshift-ovn-kubernetes/ovnkube-node-9mrsq" Nov 28 13:30:42 crc kubenswrapper[4970]: I1128 13:30:42.989913 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/f28c398a-ea7e-4117-b7f9-fa78579062e1-ovn-node-metrics-cert\") pod \"ovnkube-node-9mrsq\" (UID: \"f28c398a-ea7e-4117-b7f9-fa78579062e1\") " pod="openshift-ovn-kubernetes/ovnkube-node-9mrsq" Nov 28 13:30:42 crc kubenswrapper[4970]: I1128 13:30:42.989944 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f28c398a-ea7e-4117-b7f9-fa78579062e1-run-openvswitch\") pod \"ovnkube-node-9mrsq\" (UID: \"f28c398a-ea7e-4117-b7f9-fa78579062e1\") " pod="openshift-ovn-kubernetes/ovnkube-node-9mrsq" Nov 28 13:30:42 crc kubenswrapper[4970]: I1128 13:30:42.989967 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/f28c398a-ea7e-4117-b7f9-fa78579062e1-node-log\") pod \"ovnkube-node-9mrsq\" (UID: \"f28c398a-ea7e-4117-b7f9-fa78579062e1\") " pod="openshift-ovn-kubernetes/ovnkube-node-9mrsq" Nov 28 13:30:42 crc kubenswrapper[4970]: I1128 13:30:42.989998 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/f28c398a-ea7e-4117-b7f9-fa78579062e1-run-ovn\") pod \"ovnkube-node-9mrsq\" (UID: \"f28c398a-ea7e-4117-b7f9-fa78579062e1\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-9mrsq" Nov 28 13:30:42 crc kubenswrapper[4970]: I1128 13:30:42.990020 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f28c398a-ea7e-4117-b7f9-fa78579062e1-host-run-ovn-kubernetes\") pod \"ovnkube-node-9mrsq\" (UID: \"f28c398a-ea7e-4117-b7f9-fa78579062e1\") " pod="openshift-ovn-kubernetes/ovnkube-node-9mrsq" Nov 28 13:30:42 crc kubenswrapper[4970]: I1128 13:30:42.990075 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/f28c398a-ea7e-4117-b7f9-fa78579062e1-host-cni-bin\") pod \"ovnkube-node-9mrsq\" (UID: \"f28c398a-ea7e-4117-b7f9-fa78579062e1\") " pod="openshift-ovn-kubernetes/ovnkube-node-9mrsq" Nov 28 13:30:42 crc kubenswrapper[4970]: I1128 13:30:42.990109 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/f28c398a-ea7e-4117-b7f9-fa78579062e1-env-overrides\") pod \"ovnkube-node-9mrsq\" (UID: \"f28c398a-ea7e-4117-b7f9-fa78579062e1\") " pod="openshift-ovn-kubernetes/ovnkube-node-9mrsq" Nov 28 13:30:42 crc kubenswrapper[4970]: I1128 13:30:42.990128 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/f28c398a-ea7e-4117-b7f9-fa78579062e1-host-cni-netd\") pod \"ovnkube-node-9mrsq\" (UID: \"f28c398a-ea7e-4117-b7f9-fa78579062e1\") " pod="openshift-ovn-kubernetes/ovnkube-node-9mrsq" Nov 28 13:30:42 crc kubenswrapper[4970]: I1128 13:30:42.990145 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/f28c398a-ea7e-4117-b7f9-fa78579062e1-systemd-units\") pod \"ovnkube-node-9mrsq\" (UID: \"f28c398a-ea7e-4117-b7f9-fa78579062e1\") " pod="openshift-ovn-kubernetes/ovnkube-node-9mrsq" Nov 28 13:30:42 crc kubenswrapper[4970]: I1128 13:30:42.990165 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/f28c398a-ea7e-4117-b7f9-fa78579062e1-host-run-netns\") pod \"ovnkube-node-9mrsq\" (UID: \"f28c398a-ea7e-4117-b7f9-fa78579062e1\") " pod="openshift-ovn-kubernetes/ovnkube-node-9mrsq" Nov 28 13:30:42 crc kubenswrapper[4970]: I1128 13:30:42.990193 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f28c398a-ea7e-4117-b7f9-fa78579062e1-var-lib-openvswitch\") pod \"ovnkube-node-9mrsq\" (UID: \"f28c398a-ea7e-4117-b7f9-fa78579062e1\") " pod="openshift-ovn-kubernetes/ovnkube-node-9mrsq" Nov 28 13:30:42 crc kubenswrapper[4970]: I1128 13:30:42.990230 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f28c398a-ea7e-4117-b7f9-fa78579062e1-etc-openvswitch\") pod \"ovnkube-node-9mrsq\" (UID: \"f28c398a-ea7e-4117-b7f9-fa78579062e1\") " pod="openshift-ovn-kubernetes/ovnkube-node-9mrsq" Nov 28 13:30:42 crc kubenswrapper[4970]: I1128 13:30:42.990199 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/17474edc-f114-4ee6-b6bb-95b55f1731ac-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "17474edc-f114-4ee6-b6bb-95b55f1731ac" (UID: "17474edc-f114-4ee6-b6bb-95b55f1731ac"). InnerVolumeSpecName "ovnkube-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 28 13:30:42 crc kubenswrapper[4970]: I1128 13:30:42.990252 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/f28c398a-ea7e-4117-b7f9-fa78579062e1-host-kubelet\") pod \"ovnkube-node-9mrsq\" (UID: \"f28c398a-ea7e-4117-b7f9-fa78579062e1\") " pod="openshift-ovn-kubernetes/ovnkube-node-9mrsq" Nov 28 13:30:42 crc kubenswrapper[4970]: I1128 13:30:42.990273 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/f28c398a-ea7e-4117-b7f9-fa78579062e1-ovnkube-config\") pod \"ovnkube-node-9mrsq\" (UID: \"f28c398a-ea7e-4117-b7f9-fa78579062e1\") " pod="openshift-ovn-kubernetes/ovnkube-node-9mrsq" Nov 28 13:30:42 crc kubenswrapper[4970]: I1128 13:30:42.990307 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/f28c398a-ea7e-4117-b7f9-fa78579062e1-host-cni-bin\") pod \"ovnkube-node-9mrsq\" (UID: \"f28c398a-ea7e-4117-b7f9-fa78579062e1\") " pod="openshift-ovn-kubernetes/ovnkube-node-9mrsq" Nov 28 13:30:42 crc kubenswrapper[4970]: I1128 13:30:42.990342 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/f28c398a-ea7e-4117-b7f9-fa78579062e1-ovnkube-script-lib\") pod \"ovnkube-node-9mrsq\" (UID: \"f28c398a-ea7e-4117-b7f9-fa78579062e1\") " pod="openshift-ovn-kubernetes/ovnkube-node-9mrsq" Nov 28 13:30:42 crc kubenswrapper[4970]: I1128 13:30:42.990365 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z8gcn\" (UniqueName: \"kubernetes.io/projected/f28c398a-ea7e-4117-b7f9-fa78579062e1-kube-api-access-z8gcn\") pod \"ovnkube-node-9mrsq\" (UID: \"f28c398a-ea7e-4117-b7f9-fa78579062e1\") " pod="openshift-ovn-kubernetes/ovnkube-node-9mrsq" Nov 28 13:30:42 crc kubenswrapper[4970]: I1128 13:30:42.990380 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f28c398a-ea7e-4117-b7f9-fa78579062e1-run-openvswitch\") pod \"ovnkube-node-9mrsq\" (UID: \"f28c398a-ea7e-4117-b7f9-fa78579062e1\") " pod="openshift-ovn-kubernetes/ovnkube-node-9mrsq" Nov 28 13:30:42 crc kubenswrapper[4970]: I1128 13:30:42.990422 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/f28c398a-ea7e-4117-b7f9-fa78579062e1-log-socket\") pod \"ovnkube-node-9mrsq\" (UID: \"f28c398a-ea7e-4117-b7f9-fa78579062e1\") " pod="openshift-ovn-kubernetes/ovnkube-node-9mrsq" Nov 28 13:30:42 crc kubenswrapper[4970]: I1128 13:30:42.990467 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/f28c398a-ea7e-4117-b7f9-fa78579062e1-node-log\") pod \"ovnkube-node-9mrsq\" (UID: \"f28c398a-ea7e-4117-b7f9-fa78579062e1\") " pod="openshift-ovn-kubernetes/ovnkube-node-9mrsq" Nov 28 13:30:42 crc kubenswrapper[4970]: I1128 13:30:42.990494 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/f28c398a-ea7e-4117-b7f9-fa78579062e1-host-kubelet\") pod \"ovnkube-node-9mrsq\" (UID: \"f28c398a-ea7e-4117-b7f9-fa78579062e1\") " pod="openshift-ovn-kubernetes/ovnkube-node-9mrsq" Nov 28 13:30:42 crc kubenswrapper[4970]: I1128 13:30:42.990514 4970 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/f28c398a-ea7e-4117-b7f9-fa78579062e1-run-ovn\") pod \"ovnkube-node-9mrsq\" (UID: \"f28c398a-ea7e-4117-b7f9-fa78579062e1\") " pod="openshift-ovn-kubernetes/ovnkube-node-9mrsq" Nov 28 13:30:42 crc kubenswrapper[4970]: I1128 13:30:42.990537 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/f28c398a-ea7e-4117-b7f9-fa78579062e1-host-cni-netd\") pod \"ovnkube-node-9mrsq\" (UID: \"f28c398a-ea7e-4117-b7f9-fa78579062e1\") " pod="openshift-ovn-kubernetes/ovnkube-node-9mrsq" Nov 28 13:30:42 crc kubenswrapper[4970]: I1128 13:30:42.990561 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/f28c398a-ea7e-4117-b7f9-fa78579062e1-systemd-units\") pod \"ovnkube-node-9mrsq\" (UID: \"f28c398a-ea7e-4117-b7f9-fa78579062e1\") " pod="openshift-ovn-kubernetes/ovnkube-node-9mrsq" Nov 28 13:30:42 crc kubenswrapper[4970]: I1128 13:30:42.990560 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f28c398a-ea7e-4117-b7f9-fa78579062e1-host-run-ovn-kubernetes\") pod \"ovnkube-node-9mrsq\" (UID: \"f28c398a-ea7e-4117-b7f9-fa78579062e1\") " pod="openshift-ovn-kubernetes/ovnkube-node-9mrsq" Nov 28 13:30:42 crc kubenswrapper[4970]: I1128 13:30:42.990583 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/f28c398a-ea7e-4117-b7f9-fa78579062e1-host-run-netns\") pod \"ovnkube-node-9mrsq\" (UID: \"f28c398a-ea7e-4117-b7f9-fa78579062e1\") " pod="openshift-ovn-kubernetes/ovnkube-node-9mrsq" Nov 28 13:30:42 crc kubenswrapper[4970]: I1128 13:30:42.990629 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f28c398a-ea7e-4117-b7f9-fa78579062e1-var-lib-openvswitch\") pod \"ovnkube-node-9mrsq\" (UID: \"f28c398a-ea7e-4117-b7f9-fa78579062e1\") " pod="openshift-ovn-kubernetes/ovnkube-node-9mrsq" Nov 28 13:30:42 crc kubenswrapper[4970]: I1128 13:30:42.990760 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f28c398a-ea7e-4117-b7f9-fa78579062e1-etc-openvswitch\") pod \"ovnkube-node-9mrsq\" (UID: \"f28c398a-ea7e-4117-b7f9-fa78579062e1\") " pod="openshift-ovn-kubernetes/ovnkube-node-9mrsq" Nov 28 13:30:42 crc kubenswrapper[4970]: I1128 13:30:42.990389 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/f28c398a-ea7e-4117-b7f9-fa78579062e1-log-socket\") pod \"ovnkube-node-9mrsq\" (UID: \"f28c398a-ea7e-4117-b7f9-fa78579062e1\") " pod="openshift-ovn-kubernetes/ovnkube-node-9mrsq" Nov 28 13:30:42 crc kubenswrapper[4970]: I1128 13:30:42.990980 4970 reconciler_common.go:293] "Volume detached for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/17474edc-f114-4ee6-b6bb-95b55f1731ac-etc-openvswitch\") on node \"crc\" DevicePath \"\"" Nov 28 13:30:42 crc kubenswrapper[4970]: I1128 13:30:42.991002 4970 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/17474edc-f114-4ee6-b6bb-95b55f1731ac-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Nov 28 13:30:42 crc kubenswrapper[4970]: I1128 
13:30:42.991020 4970 reconciler_common.go:293] "Volume detached for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/17474edc-f114-4ee6-b6bb-95b55f1731ac-host-slash\") on node \"crc\" DevicePath \"\"" Nov 28 13:30:42 crc kubenswrapper[4970]: I1128 13:30:42.991035 4970 reconciler_common.go:293] "Volume detached for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/17474edc-f114-4ee6-b6bb-95b55f1731ac-systemd-units\") on node \"crc\" DevicePath \"\"" Nov 28 13:30:42 crc kubenswrapper[4970]: I1128 13:30:42.991052 4970 reconciler_common.go:293] "Volume detached for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/17474edc-f114-4ee6-b6bb-95b55f1731ac-var-lib-openvswitch\") on node \"crc\" DevicePath \"\"" Nov 28 13:30:42 crc kubenswrapper[4970]: I1128 13:30:42.991068 4970 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/17474edc-f114-4ee6-b6bb-95b55f1731ac-ovnkube-config\") on node \"crc\" DevicePath \"\"" Nov 28 13:30:42 crc kubenswrapper[4970]: I1128 13:30:42.991084 4970 reconciler_common.go:293] "Volume detached for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/17474edc-f114-4ee6-b6bb-95b55f1731ac-host-run-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Nov 28 13:30:42 crc kubenswrapper[4970]: I1128 13:30:42.991101 4970 reconciler_common.go:293] "Volume detached for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/17474edc-f114-4ee6-b6bb-95b55f1731ac-host-cni-netd\") on node \"crc\" DevicePath \"\"" Nov 28 13:30:42 crc kubenswrapper[4970]: I1128 13:30:42.991116 4970 reconciler_common.go:293] "Volume detached for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/17474edc-f114-4ee6-b6bb-95b55f1731ac-host-kubelet\") on node \"crc\" DevicePath \"\"" Nov 28 13:30:42 crc kubenswrapper[4970]: I1128 13:30:42.991132 4970 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/17474edc-f114-4ee6-b6bb-95b55f1731ac-env-overrides\") on node \"crc\" DevicePath \"\"" Nov 28 13:30:42 crc kubenswrapper[4970]: I1128 13:30:42.991147 4970 reconciler_common.go:293] "Volume detached for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/17474edc-f114-4ee6-b6bb-95b55f1731ac-host-var-lib-cni-networks-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Nov 28 13:30:42 crc kubenswrapper[4970]: I1128 13:30:42.991177 4970 reconciler_common.go:293] "Volume detached for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/17474edc-f114-4ee6-b6bb-95b55f1731ac-host-run-netns\") on node \"crc\" DevicePath \"\"" Nov 28 13:30:42 crc kubenswrapper[4970]: I1128 13:30:42.991194 4970 reconciler_common.go:293] "Volume detached for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/17474edc-f114-4ee6-b6bb-95b55f1731ac-log-socket\") on node \"crc\" DevicePath \"\"" Nov 28 13:30:42 crc kubenswrapper[4970]: I1128 13:30:42.991210 4970 reconciler_common.go:293] "Volume detached for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/17474edc-f114-4ee6-b6bb-95b55f1731ac-run-openvswitch\") on node \"crc\" DevicePath \"\"" Nov 28 13:30:42 crc kubenswrapper[4970]: I1128 13:30:42.992140 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/f28c398a-ea7e-4117-b7f9-fa78579062e1-ovnkube-script-lib\") pod \"ovnkube-node-9mrsq\" (UID: 
\"f28c398a-ea7e-4117-b7f9-fa78579062e1\") " pod="openshift-ovn-kubernetes/ovnkube-node-9mrsq" Nov 28 13:30:42 crc kubenswrapper[4970]: I1128 13:30:42.992526 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/f28c398a-ea7e-4117-b7f9-fa78579062e1-ovnkube-config\") pod \"ovnkube-node-9mrsq\" (UID: \"f28c398a-ea7e-4117-b7f9-fa78579062e1\") " pod="openshift-ovn-kubernetes/ovnkube-node-9mrsq" Nov 28 13:30:42 crc kubenswrapper[4970]: I1128 13:30:42.992821 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/f28c398a-ea7e-4117-b7f9-fa78579062e1-env-overrides\") pod \"ovnkube-node-9mrsq\" (UID: \"f28c398a-ea7e-4117-b7f9-fa78579062e1\") " pod="openshift-ovn-kubernetes/ovnkube-node-9mrsq" Nov 28 13:30:42 crc kubenswrapper[4970]: I1128 13:30:42.996444 4970 scope.go:117] "RemoveContainer" containerID="cdd0fc0e7dc5e274448e9288c6b82d7b14c900064fbf885342824db7ffa53248" Nov 28 13:30:42 crc kubenswrapper[4970]: I1128 13:30:42.996806 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/17474edc-f114-4ee6-b6bb-95b55f1731ac-kube-api-access-gsjj6" (OuterVolumeSpecName: "kube-api-access-gsjj6") pod "17474edc-f114-4ee6-b6bb-95b55f1731ac" (UID: "17474edc-f114-4ee6-b6bb-95b55f1731ac"). InnerVolumeSpecName "kube-api-access-gsjj6". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 28 13:30:42 crc kubenswrapper[4970]: I1128 13:30:42.999200 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/17474edc-f114-4ee6-b6bb-95b55f1731ac-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "17474edc-f114-4ee6-b6bb-95b55f1731ac" (UID: "17474edc-f114-4ee6-b6bb-95b55f1731ac"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 28 13:30:42 crc kubenswrapper[4970]: I1128 13:30:42.999512 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/f28c398a-ea7e-4117-b7f9-fa78579062e1-ovn-node-metrics-cert\") pod \"ovnkube-node-9mrsq\" (UID: \"f28c398a-ea7e-4117-b7f9-fa78579062e1\") " pod="openshift-ovn-kubernetes/ovnkube-node-9mrsq" Nov 28 13:30:43 crc kubenswrapper[4970]: I1128 13:30:43.008650 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/17474edc-f114-4ee6-b6bb-95b55f1731ac-run-systemd" (OuterVolumeSpecName: "run-systemd") pod "17474edc-f114-4ee6-b6bb-95b55f1731ac" (UID: "17474edc-f114-4ee6-b6bb-95b55f1731ac"). InnerVolumeSpecName "run-systemd". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 28 13:30:43 crc kubenswrapper[4970]: I1128 13:30:43.013986 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z8gcn\" (UniqueName: \"kubernetes.io/projected/f28c398a-ea7e-4117-b7f9-fa78579062e1-kube-api-access-z8gcn\") pod \"ovnkube-node-9mrsq\" (UID: \"f28c398a-ea7e-4117-b7f9-fa78579062e1\") " pod="openshift-ovn-kubernetes/ovnkube-node-9mrsq" Nov 28 13:30:43 crc kubenswrapper[4970]: I1128 13:30:43.021781 4970 scope.go:117] "RemoveContainer" containerID="50bbc6bccc812634f10fce491d51f0bfc11b450bdd9435d97b5723bc25815737" Nov 28 13:30:43 crc kubenswrapper[4970]: I1128 13:30:43.040956 4970 scope.go:117] "RemoveContainer" containerID="189ed85c89ee8535a047fcd0640a468aa0ea5ce2f787ec1e5c83b10b2c4636f8" Nov 28 13:30:43 crc kubenswrapper[4970]: I1128 13:30:43.084425 4970 scope.go:117] "RemoveContainer" containerID="12d0179126488b68ea59ae5ac2c1144408426c93d8ee99a546fe23c0e78bc29d" Nov 28 13:30:43 crc kubenswrapper[4970]: I1128 13:30:43.092076 4970 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gsjj6\" (UniqueName: \"kubernetes.io/projected/17474edc-f114-4ee6-b6bb-95b55f1731ac-kube-api-access-gsjj6\") on node \"crc\" DevicePath \"\"" Nov 28 13:30:43 crc kubenswrapper[4970]: I1128 13:30:43.092480 4970 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/17474edc-f114-4ee6-b6bb-95b55f1731ac-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Nov 28 13:30:43 crc kubenswrapper[4970]: I1128 13:30:43.092489 4970 reconciler_common.go:293] "Volume detached for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/17474edc-f114-4ee6-b6bb-95b55f1731ac-run-systemd\") on node \"crc\" DevicePath \"\"" Nov 28 13:30:43 crc kubenswrapper[4970]: I1128 13:30:43.099589 4970 scope.go:117] "RemoveContainer" containerID="a7d93a763c00244c10708497ff36b83939050bf928342bd4ead223a08fda036d" Nov 28 13:30:43 crc kubenswrapper[4970]: I1128 13:30:43.128854 4970 scope.go:117] "RemoveContainer" containerID="3dbb30bfa38ca50eb6b55645d501b246d1f9a16d6d4c6b40402ba6aac34f8e50" Nov 28 13:30:43 crc kubenswrapper[4970]: I1128 13:30:43.151520 4970 scope.go:117] "RemoveContainer" containerID="f3080d04f3ff55e9ea3e5226bc08fed0e4937b1ebbab93b76c039066071aba2d" Nov 28 13:30:43 crc kubenswrapper[4970]: I1128 13:30:43.168821 4970 scope.go:117] "RemoveContainer" containerID="af322842df0e9f9249a0d4c5342059686d51e57a9259e9d7f99d3ab656639a63" Nov 28 13:30:43 crc kubenswrapper[4970]: I1128 13:30:43.183397 4970 scope.go:117] "RemoveContainer" containerID="1f1a97d1705ed1e76b0cf0f089eb6c47193e9abd9fb034819db37ed7fa020ecf" Nov 28 13:30:43 crc kubenswrapper[4970]: E1128 13:30:43.184783 4970 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1f1a97d1705ed1e76b0cf0f089eb6c47193e9abd9fb034819db37ed7fa020ecf\": container with ID starting with 1f1a97d1705ed1e76b0cf0f089eb6c47193e9abd9fb034819db37ed7fa020ecf not found: ID does not exist" containerID="1f1a97d1705ed1e76b0cf0f089eb6c47193e9abd9fb034819db37ed7fa020ecf" Nov 28 13:30:43 crc kubenswrapper[4970]: I1128 13:30:43.184814 4970 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1f1a97d1705ed1e76b0cf0f089eb6c47193e9abd9fb034819db37ed7fa020ecf"} err="failed to get container status \"1f1a97d1705ed1e76b0cf0f089eb6c47193e9abd9fb034819db37ed7fa020ecf\": rpc error: code = NotFound desc = could not find 
container \"1f1a97d1705ed1e76b0cf0f089eb6c47193e9abd9fb034819db37ed7fa020ecf\": container with ID starting with 1f1a97d1705ed1e76b0cf0f089eb6c47193e9abd9fb034819db37ed7fa020ecf not found: ID does not exist" Nov 28 13:30:43 crc kubenswrapper[4970]: I1128 13:30:43.184840 4970 scope.go:117] "RemoveContainer" containerID="cdd0fc0e7dc5e274448e9288c6b82d7b14c900064fbf885342824db7ffa53248" Nov 28 13:30:43 crc kubenswrapper[4970]: E1128 13:30:43.185145 4970 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cdd0fc0e7dc5e274448e9288c6b82d7b14c900064fbf885342824db7ffa53248\": container with ID starting with cdd0fc0e7dc5e274448e9288c6b82d7b14c900064fbf885342824db7ffa53248 not found: ID does not exist" containerID="cdd0fc0e7dc5e274448e9288c6b82d7b14c900064fbf885342824db7ffa53248" Nov 28 13:30:43 crc kubenswrapper[4970]: I1128 13:30:43.185168 4970 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cdd0fc0e7dc5e274448e9288c6b82d7b14c900064fbf885342824db7ffa53248"} err="failed to get container status \"cdd0fc0e7dc5e274448e9288c6b82d7b14c900064fbf885342824db7ffa53248\": rpc error: code = NotFound desc = could not find container \"cdd0fc0e7dc5e274448e9288c6b82d7b14c900064fbf885342824db7ffa53248\": container with ID starting with cdd0fc0e7dc5e274448e9288c6b82d7b14c900064fbf885342824db7ffa53248 not found: ID does not exist" Nov 28 13:30:43 crc kubenswrapper[4970]: I1128 13:30:43.185183 4970 scope.go:117] "RemoveContainer" containerID="50bbc6bccc812634f10fce491d51f0bfc11b450bdd9435d97b5723bc25815737" Nov 28 13:30:43 crc kubenswrapper[4970]: E1128 13:30:43.185493 4970 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"50bbc6bccc812634f10fce491d51f0bfc11b450bdd9435d97b5723bc25815737\": container with ID starting with 50bbc6bccc812634f10fce491d51f0bfc11b450bdd9435d97b5723bc25815737 not found: ID does not exist" containerID="50bbc6bccc812634f10fce491d51f0bfc11b450bdd9435d97b5723bc25815737" Nov 28 13:30:43 crc kubenswrapper[4970]: I1128 13:30:43.185512 4970 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"50bbc6bccc812634f10fce491d51f0bfc11b450bdd9435d97b5723bc25815737"} err="failed to get container status \"50bbc6bccc812634f10fce491d51f0bfc11b450bdd9435d97b5723bc25815737\": rpc error: code = NotFound desc = could not find container \"50bbc6bccc812634f10fce491d51f0bfc11b450bdd9435d97b5723bc25815737\": container with ID starting with 50bbc6bccc812634f10fce491d51f0bfc11b450bdd9435d97b5723bc25815737 not found: ID does not exist" Nov 28 13:30:43 crc kubenswrapper[4970]: I1128 13:30:43.185525 4970 scope.go:117] "RemoveContainer" containerID="189ed85c89ee8535a047fcd0640a468aa0ea5ce2f787ec1e5c83b10b2c4636f8" Nov 28 13:30:43 crc kubenswrapper[4970]: E1128 13:30:43.185758 4970 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"189ed85c89ee8535a047fcd0640a468aa0ea5ce2f787ec1e5c83b10b2c4636f8\": container with ID starting with 189ed85c89ee8535a047fcd0640a468aa0ea5ce2f787ec1e5c83b10b2c4636f8 not found: ID does not exist" containerID="189ed85c89ee8535a047fcd0640a468aa0ea5ce2f787ec1e5c83b10b2c4636f8" Nov 28 13:30:43 crc kubenswrapper[4970]: I1128 13:30:43.185813 4970 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"189ed85c89ee8535a047fcd0640a468aa0ea5ce2f787ec1e5c83b10b2c4636f8"} err="failed to get container status \"189ed85c89ee8535a047fcd0640a468aa0ea5ce2f787ec1e5c83b10b2c4636f8\": rpc error: code = NotFound desc = could not find container \"189ed85c89ee8535a047fcd0640a468aa0ea5ce2f787ec1e5c83b10b2c4636f8\": container with ID starting with 189ed85c89ee8535a047fcd0640a468aa0ea5ce2f787ec1e5c83b10b2c4636f8 not found: ID does not exist" Nov 28 13:30:43 crc kubenswrapper[4970]: I1128 13:30:43.185845 4970 scope.go:117] "RemoveContainer" containerID="12d0179126488b68ea59ae5ac2c1144408426c93d8ee99a546fe23c0e78bc29d" Nov 28 13:30:43 crc kubenswrapper[4970]: E1128 13:30:43.186073 4970 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"12d0179126488b68ea59ae5ac2c1144408426c93d8ee99a546fe23c0e78bc29d\": container with ID starting with 12d0179126488b68ea59ae5ac2c1144408426c93d8ee99a546fe23c0e78bc29d not found: ID does not exist" containerID="12d0179126488b68ea59ae5ac2c1144408426c93d8ee99a546fe23c0e78bc29d" Nov 28 13:30:43 crc kubenswrapper[4970]: I1128 13:30:43.186099 4970 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"12d0179126488b68ea59ae5ac2c1144408426c93d8ee99a546fe23c0e78bc29d"} err="failed to get container status \"12d0179126488b68ea59ae5ac2c1144408426c93d8ee99a546fe23c0e78bc29d\": rpc error: code = NotFound desc = could not find container \"12d0179126488b68ea59ae5ac2c1144408426c93d8ee99a546fe23c0e78bc29d\": container with ID starting with 12d0179126488b68ea59ae5ac2c1144408426c93d8ee99a546fe23c0e78bc29d not found: ID does not exist" Nov 28 13:30:43 crc kubenswrapper[4970]: I1128 13:30:43.186117 4970 scope.go:117] "RemoveContainer" containerID="a7d93a763c00244c10708497ff36b83939050bf928342bd4ead223a08fda036d" Nov 28 13:30:43 crc kubenswrapper[4970]: E1128 13:30:43.186355 4970 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a7d93a763c00244c10708497ff36b83939050bf928342bd4ead223a08fda036d\": container with ID starting with a7d93a763c00244c10708497ff36b83939050bf928342bd4ead223a08fda036d not found: ID does not exist" containerID="a7d93a763c00244c10708497ff36b83939050bf928342bd4ead223a08fda036d" Nov 28 13:30:43 crc kubenswrapper[4970]: I1128 13:30:43.186383 4970 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a7d93a763c00244c10708497ff36b83939050bf928342bd4ead223a08fda036d"} err="failed to get container status \"a7d93a763c00244c10708497ff36b83939050bf928342bd4ead223a08fda036d\": rpc error: code = NotFound desc = could not find container \"a7d93a763c00244c10708497ff36b83939050bf928342bd4ead223a08fda036d\": container with ID starting with a7d93a763c00244c10708497ff36b83939050bf928342bd4ead223a08fda036d not found: ID does not exist" Nov 28 13:30:43 crc kubenswrapper[4970]: I1128 13:30:43.186401 4970 scope.go:117] "RemoveContainer" containerID="3dbb30bfa38ca50eb6b55645d501b246d1f9a16d6d4c6b40402ba6aac34f8e50" Nov 28 13:30:43 crc kubenswrapper[4970]: E1128 13:30:43.186946 4970 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3dbb30bfa38ca50eb6b55645d501b246d1f9a16d6d4c6b40402ba6aac34f8e50\": container with ID starting with 3dbb30bfa38ca50eb6b55645d501b246d1f9a16d6d4c6b40402ba6aac34f8e50 not found: ID does not exist" 
containerID="3dbb30bfa38ca50eb6b55645d501b246d1f9a16d6d4c6b40402ba6aac34f8e50" Nov 28 13:30:43 crc kubenswrapper[4970]: I1128 13:30:43.187001 4970 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3dbb30bfa38ca50eb6b55645d501b246d1f9a16d6d4c6b40402ba6aac34f8e50"} err="failed to get container status \"3dbb30bfa38ca50eb6b55645d501b246d1f9a16d6d4c6b40402ba6aac34f8e50\": rpc error: code = NotFound desc = could not find container \"3dbb30bfa38ca50eb6b55645d501b246d1f9a16d6d4c6b40402ba6aac34f8e50\": container with ID starting with 3dbb30bfa38ca50eb6b55645d501b246d1f9a16d6d4c6b40402ba6aac34f8e50 not found: ID does not exist" Nov 28 13:30:43 crc kubenswrapper[4970]: I1128 13:30:43.187030 4970 scope.go:117] "RemoveContainer" containerID="f3080d04f3ff55e9ea3e5226bc08fed0e4937b1ebbab93b76c039066071aba2d" Nov 28 13:30:43 crc kubenswrapper[4970]: E1128 13:30:43.187250 4970 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f3080d04f3ff55e9ea3e5226bc08fed0e4937b1ebbab93b76c039066071aba2d\": container with ID starting with f3080d04f3ff55e9ea3e5226bc08fed0e4937b1ebbab93b76c039066071aba2d not found: ID does not exist" containerID="f3080d04f3ff55e9ea3e5226bc08fed0e4937b1ebbab93b76c039066071aba2d" Nov 28 13:30:43 crc kubenswrapper[4970]: I1128 13:30:43.187271 4970 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f3080d04f3ff55e9ea3e5226bc08fed0e4937b1ebbab93b76c039066071aba2d"} err="failed to get container status \"f3080d04f3ff55e9ea3e5226bc08fed0e4937b1ebbab93b76c039066071aba2d\": rpc error: code = NotFound desc = could not find container \"f3080d04f3ff55e9ea3e5226bc08fed0e4937b1ebbab93b76c039066071aba2d\": container with ID starting with f3080d04f3ff55e9ea3e5226bc08fed0e4937b1ebbab93b76c039066071aba2d not found: ID does not exist" Nov 28 13:30:43 crc kubenswrapper[4970]: I1128 13:30:43.187284 4970 scope.go:117] "RemoveContainer" containerID="af322842df0e9f9249a0d4c5342059686d51e57a9259e9d7f99d3ab656639a63" Nov 28 13:30:43 crc kubenswrapper[4970]: E1128 13:30:43.187747 4970 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"af322842df0e9f9249a0d4c5342059686d51e57a9259e9d7f99d3ab656639a63\": container with ID starting with af322842df0e9f9249a0d4c5342059686d51e57a9259e9d7f99d3ab656639a63 not found: ID does not exist" containerID="af322842df0e9f9249a0d4c5342059686d51e57a9259e9d7f99d3ab656639a63" Nov 28 13:30:43 crc kubenswrapper[4970]: I1128 13:30:43.187773 4970 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"af322842df0e9f9249a0d4c5342059686d51e57a9259e9d7f99d3ab656639a63"} err="failed to get container status \"af322842df0e9f9249a0d4c5342059686d51e57a9259e9d7f99d3ab656639a63\": rpc error: code = NotFound desc = could not find container \"af322842df0e9f9249a0d4c5342059686d51e57a9259e9d7f99d3ab656639a63\": container with ID starting with af322842df0e9f9249a0d4c5342059686d51e57a9259e9d7f99d3ab656639a63 not found: ID does not exist" Nov 28 13:30:43 crc kubenswrapper[4970]: I1128 13:30:43.187790 4970 scope.go:117] "RemoveContainer" containerID="1f1a97d1705ed1e76b0cf0f089eb6c47193e9abd9fb034819db37ed7fa020ecf" Nov 28 13:30:43 crc kubenswrapper[4970]: I1128 13:30:43.188102 4970 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"1f1a97d1705ed1e76b0cf0f089eb6c47193e9abd9fb034819db37ed7fa020ecf"} err="failed to get container status \"1f1a97d1705ed1e76b0cf0f089eb6c47193e9abd9fb034819db37ed7fa020ecf\": rpc error: code = NotFound desc = could not find container \"1f1a97d1705ed1e76b0cf0f089eb6c47193e9abd9fb034819db37ed7fa020ecf\": container with ID starting with 1f1a97d1705ed1e76b0cf0f089eb6c47193e9abd9fb034819db37ed7fa020ecf not found: ID does not exist" Nov 28 13:30:43 crc kubenswrapper[4970]: I1128 13:30:43.188139 4970 scope.go:117] "RemoveContainer" containerID="cdd0fc0e7dc5e274448e9288c6b82d7b14c900064fbf885342824db7ffa53248" Nov 28 13:30:43 crc kubenswrapper[4970]: I1128 13:30:43.188388 4970 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cdd0fc0e7dc5e274448e9288c6b82d7b14c900064fbf885342824db7ffa53248"} err="failed to get container status \"cdd0fc0e7dc5e274448e9288c6b82d7b14c900064fbf885342824db7ffa53248\": rpc error: code = NotFound desc = could not find container \"cdd0fc0e7dc5e274448e9288c6b82d7b14c900064fbf885342824db7ffa53248\": container with ID starting with cdd0fc0e7dc5e274448e9288c6b82d7b14c900064fbf885342824db7ffa53248 not found: ID does not exist" Nov 28 13:30:43 crc kubenswrapper[4970]: I1128 13:30:43.188413 4970 scope.go:117] "RemoveContainer" containerID="50bbc6bccc812634f10fce491d51f0bfc11b450bdd9435d97b5723bc25815737" Nov 28 13:30:43 crc kubenswrapper[4970]: I1128 13:30:43.191334 4970 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"50bbc6bccc812634f10fce491d51f0bfc11b450bdd9435d97b5723bc25815737"} err="failed to get container status \"50bbc6bccc812634f10fce491d51f0bfc11b450bdd9435d97b5723bc25815737\": rpc error: code = NotFound desc = could not find container \"50bbc6bccc812634f10fce491d51f0bfc11b450bdd9435d97b5723bc25815737\": container with ID starting with 50bbc6bccc812634f10fce491d51f0bfc11b450bdd9435d97b5723bc25815737 not found: ID does not exist" Nov 28 13:30:43 crc kubenswrapper[4970]: I1128 13:30:43.191364 4970 scope.go:117] "RemoveContainer" containerID="189ed85c89ee8535a047fcd0640a468aa0ea5ce2f787ec1e5c83b10b2c4636f8" Nov 28 13:30:43 crc kubenswrapper[4970]: I1128 13:30:43.191609 4970 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"189ed85c89ee8535a047fcd0640a468aa0ea5ce2f787ec1e5c83b10b2c4636f8"} err="failed to get container status \"189ed85c89ee8535a047fcd0640a468aa0ea5ce2f787ec1e5c83b10b2c4636f8\": rpc error: code = NotFound desc = could not find container \"189ed85c89ee8535a047fcd0640a468aa0ea5ce2f787ec1e5c83b10b2c4636f8\": container with ID starting with 189ed85c89ee8535a047fcd0640a468aa0ea5ce2f787ec1e5c83b10b2c4636f8 not found: ID does not exist" Nov 28 13:30:43 crc kubenswrapper[4970]: I1128 13:30:43.191637 4970 scope.go:117] "RemoveContainer" containerID="12d0179126488b68ea59ae5ac2c1144408426c93d8ee99a546fe23c0e78bc29d" Nov 28 13:30:43 crc kubenswrapper[4970]: I1128 13:30:43.193092 4970 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"12d0179126488b68ea59ae5ac2c1144408426c93d8ee99a546fe23c0e78bc29d"} err="failed to get container status \"12d0179126488b68ea59ae5ac2c1144408426c93d8ee99a546fe23c0e78bc29d\": rpc error: code = NotFound desc = could not find container \"12d0179126488b68ea59ae5ac2c1144408426c93d8ee99a546fe23c0e78bc29d\": container with ID starting with 12d0179126488b68ea59ae5ac2c1144408426c93d8ee99a546fe23c0e78bc29d not found: ID does not exist" Nov 
28 13:30:43 crc kubenswrapper[4970]: I1128 13:30:43.193117 4970 scope.go:117] "RemoveContainer" containerID="a7d93a763c00244c10708497ff36b83939050bf928342bd4ead223a08fda036d" Nov 28 13:30:43 crc kubenswrapper[4970]: I1128 13:30:43.196614 4970 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-9mrsq" Nov 28 13:30:43 crc kubenswrapper[4970]: I1128 13:30:43.196738 4970 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a7d93a763c00244c10708497ff36b83939050bf928342bd4ead223a08fda036d"} err="failed to get container status \"a7d93a763c00244c10708497ff36b83939050bf928342bd4ead223a08fda036d\": rpc error: code = NotFound desc = could not find container \"a7d93a763c00244c10708497ff36b83939050bf928342bd4ead223a08fda036d\": container with ID starting with a7d93a763c00244c10708497ff36b83939050bf928342bd4ead223a08fda036d not found: ID does not exist" Nov 28 13:30:43 crc kubenswrapper[4970]: I1128 13:30:43.196767 4970 scope.go:117] "RemoveContainer" containerID="3dbb30bfa38ca50eb6b55645d501b246d1f9a16d6d4c6b40402ba6aac34f8e50" Nov 28 13:30:43 crc kubenswrapper[4970]: I1128 13:30:43.197122 4970 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3dbb30bfa38ca50eb6b55645d501b246d1f9a16d6d4c6b40402ba6aac34f8e50"} err="failed to get container status \"3dbb30bfa38ca50eb6b55645d501b246d1f9a16d6d4c6b40402ba6aac34f8e50\": rpc error: code = NotFound desc = could not find container \"3dbb30bfa38ca50eb6b55645d501b246d1f9a16d6d4c6b40402ba6aac34f8e50\": container with ID starting with 3dbb30bfa38ca50eb6b55645d501b246d1f9a16d6d4c6b40402ba6aac34f8e50 not found: ID does not exist" Nov 28 13:30:43 crc kubenswrapper[4970]: I1128 13:30:43.197141 4970 scope.go:117] "RemoveContainer" containerID="f3080d04f3ff55e9ea3e5226bc08fed0e4937b1ebbab93b76c039066071aba2d" Nov 28 13:30:43 crc kubenswrapper[4970]: I1128 13:30:43.197455 4970 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f3080d04f3ff55e9ea3e5226bc08fed0e4937b1ebbab93b76c039066071aba2d"} err="failed to get container status \"f3080d04f3ff55e9ea3e5226bc08fed0e4937b1ebbab93b76c039066071aba2d\": rpc error: code = NotFound desc = could not find container \"f3080d04f3ff55e9ea3e5226bc08fed0e4937b1ebbab93b76c039066071aba2d\": container with ID starting with f3080d04f3ff55e9ea3e5226bc08fed0e4937b1ebbab93b76c039066071aba2d not found: ID does not exist" Nov 28 13:30:43 crc kubenswrapper[4970]: I1128 13:30:43.197480 4970 scope.go:117] "RemoveContainer" containerID="af322842df0e9f9249a0d4c5342059686d51e57a9259e9d7f99d3ab656639a63" Nov 28 13:30:43 crc kubenswrapper[4970]: I1128 13:30:43.197886 4970 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"af322842df0e9f9249a0d4c5342059686d51e57a9259e9d7f99d3ab656639a63"} err="failed to get container status \"af322842df0e9f9249a0d4c5342059686d51e57a9259e9d7f99d3ab656639a63\": rpc error: code = NotFound desc = could not find container \"af322842df0e9f9249a0d4c5342059686d51e57a9259e9d7f99d3ab656639a63\": container with ID starting with af322842df0e9f9249a0d4c5342059686d51e57a9259e9d7f99d3ab656639a63 not found: ID does not exist" Nov 28 13:30:43 crc kubenswrapper[4970]: I1128 13:30:43.197908 4970 scope.go:117] "RemoveContainer" containerID="1f1a97d1705ed1e76b0cf0f089eb6c47193e9abd9fb034819db37ed7fa020ecf" Nov 28 13:30:43 crc kubenswrapper[4970]: I1128 13:30:43.198913 4970 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1f1a97d1705ed1e76b0cf0f089eb6c47193e9abd9fb034819db37ed7fa020ecf"} err="failed to get container status \"1f1a97d1705ed1e76b0cf0f089eb6c47193e9abd9fb034819db37ed7fa020ecf\": rpc error: code = NotFound desc = could not find container \"1f1a97d1705ed1e76b0cf0f089eb6c47193e9abd9fb034819db37ed7fa020ecf\": container with ID starting with 1f1a97d1705ed1e76b0cf0f089eb6c47193e9abd9fb034819db37ed7fa020ecf not found: ID does not exist" Nov 28 13:30:43 crc kubenswrapper[4970]: I1128 13:30:43.198934 4970 scope.go:117] "RemoveContainer" containerID="cdd0fc0e7dc5e274448e9288c6b82d7b14c900064fbf885342824db7ffa53248" Nov 28 13:30:43 crc kubenswrapper[4970]: I1128 13:30:43.199700 4970 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cdd0fc0e7dc5e274448e9288c6b82d7b14c900064fbf885342824db7ffa53248"} err="failed to get container status \"cdd0fc0e7dc5e274448e9288c6b82d7b14c900064fbf885342824db7ffa53248\": rpc error: code = NotFound desc = could not find container \"cdd0fc0e7dc5e274448e9288c6b82d7b14c900064fbf885342824db7ffa53248\": container with ID starting with cdd0fc0e7dc5e274448e9288c6b82d7b14c900064fbf885342824db7ffa53248 not found: ID does not exist" Nov 28 13:30:43 crc kubenswrapper[4970]: I1128 13:30:43.199722 4970 scope.go:117] "RemoveContainer" containerID="50bbc6bccc812634f10fce491d51f0bfc11b450bdd9435d97b5723bc25815737" Nov 28 13:30:43 crc kubenswrapper[4970]: I1128 13:30:43.200283 4970 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"50bbc6bccc812634f10fce491d51f0bfc11b450bdd9435d97b5723bc25815737"} err="failed to get container status \"50bbc6bccc812634f10fce491d51f0bfc11b450bdd9435d97b5723bc25815737\": rpc error: code = NotFound desc = could not find container \"50bbc6bccc812634f10fce491d51f0bfc11b450bdd9435d97b5723bc25815737\": container with ID starting with 50bbc6bccc812634f10fce491d51f0bfc11b450bdd9435d97b5723bc25815737 not found: ID does not exist" Nov 28 13:30:43 crc kubenswrapper[4970]: I1128 13:30:43.200306 4970 scope.go:117] "RemoveContainer" containerID="189ed85c89ee8535a047fcd0640a468aa0ea5ce2f787ec1e5c83b10b2c4636f8" Nov 28 13:30:43 crc kubenswrapper[4970]: I1128 13:30:43.200735 4970 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"189ed85c89ee8535a047fcd0640a468aa0ea5ce2f787ec1e5c83b10b2c4636f8"} err="failed to get container status \"189ed85c89ee8535a047fcd0640a468aa0ea5ce2f787ec1e5c83b10b2c4636f8\": rpc error: code = NotFound desc = could not find container \"189ed85c89ee8535a047fcd0640a468aa0ea5ce2f787ec1e5c83b10b2c4636f8\": container with ID starting with 189ed85c89ee8535a047fcd0640a468aa0ea5ce2f787ec1e5c83b10b2c4636f8 not found: ID does not exist" Nov 28 13:30:43 crc kubenswrapper[4970]: I1128 13:30:43.200788 4970 scope.go:117] "RemoveContainer" containerID="12d0179126488b68ea59ae5ac2c1144408426c93d8ee99a546fe23c0e78bc29d" Nov 28 13:30:43 crc kubenswrapper[4970]: I1128 13:30:43.201122 4970 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"12d0179126488b68ea59ae5ac2c1144408426c93d8ee99a546fe23c0e78bc29d"} err="failed to get container status \"12d0179126488b68ea59ae5ac2c1144408426c93d8ee99a546fe23c0e78bc29d\": rpc error: code = NotFound desc = could not find container \"12d0179126488b68ea59ae5ac2c1144408426c93d8ee99a546fe23c0e78bc29d\": container with ID starting with 12d0179126488b68ea59ae5ac2c1144408426c93d8ee99a546fe23c0e78bc29d 
not found: ID does not exist" Nov 28 13:30:43 crc kubenswrapper[4970]: I1128 13:30:43.201143 4970 scope.go:117] "RemoveContainer" containerID="a7d93a763c00244c10708497ff36b83939050bf928342bd4ead223a08fda036d" Nov 28 13:30:43 crc kubenswrapper[4970]: I1128 13:30:43.201489 4970 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a7d93a763c00244c10708497ff36b83939050bf928342bd4ead223a08fda036d"} err="failed to get container status \"a7d93a763c00244c10708497ff36b83939050bf928342bd4ead223a08fda036d\": rpc error: code = NotFound desc = could not find container \"a7d93a763c00244c10708497ff36b83939050bf928342bd4ead223a08fda036d\": container with ID starting with a7d93a763c00244c10708497ff36b83939050bf928342bd4ead223a08fda036d not found: ID does not exist" Nov 28 13:30:43 crc kubenswrapper[4970]: I1128 13:30:43.201507 4970 scope.go:117] "RemoveContainer" containerID="3dbb30bfa38ca50eb6b55645d501b246d1f9a16d6d4c6b40402ba6aac34f8e50" Nov 28 13:30:43 crc kubenswrapper[4970]: I1128 13:30:43.201843 4970 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3dbb30bfa38ca50eb6b55645d501b246d1f9a16d6d4c6b40402ba6aac34f8e50"} err="failed to get container status \"3dbb30bfa38ca50eb6b55645d501b246d1f9a16d6d4c6b40402ba6aac34f8e50\": rpc error: code = NotFound desc = could not find container \"3dbb30bfa38ca50eb6b55645d501b246d1f9a16d6d4c6b40402ba6aac34f8e50\": container with ID starting with 3dbb30bfa38ca50eb6b55645d501b246d1f9a16d6d4c6b40402ba6aac34f8e50 not found: ID does not exist" Nov 28 13:30:43 crc kubenswrapper[4970]: I1128 13:30:43.201868 4970 scope.go:117] "RemoveContainer" containerID="f3080d04f3ff55e9ea3e5226bc08fed0e4937b1ebbab93b76c039066071aba2d" Nov 28 13:30:43 crc kubenswrapper[4970]: I1128 13:30:43.202166 4970 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f3080d04f3ff55e9ea3e5226bc08fed0e4937b1ebbab93b76c039066071aba2d"} err="failed to get container status \"f3080d04f3ff55e9ea3e5226bc08fed0e4937b1ebbab93b76c039066071aba2d\": rpc error: code = NotFound desc = could not find container \"f3080d04f3ff55e9ea3e5226bc08fed0e4937b1ebbab93b76c039066071aba2d\": container with ID starting with f3080d04f3ff55e9ea3e5226bc08fed0e4937b1ebbab93b76c039066071aba2d not found: ID does not exist" Nov 28 13:30:43 crc kubenswrapper[4970]: I1128 13:30:43.202185 4970 scope.go:117] "RemoveContainer" containerID="af322842df0e9f9249a0d4c5342059686d51e57a9259e9d7f99d3ab656639a63" Nov 28 13:30:43 crc kubenswrapper[4970]: I1128 13:30:43.202934 4970 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"af322842df0e9f9249a0d4c5342059686d51e57a9259e9d7f99d3ab656639a63"} err="failed to get container status \"af322842df0e9f9249a0d4c5342059686d51e57a9259e9d7f99d3ab656639a63\": rpc error: code = NotFound desc = could not find container \"af322842df0e9f9249a0d4c5342059686d51e57a9259e9d7f99d3ab656639a63\": container with ID starting with af322842df0e9f9249a0d4c5342059686d51e57a9259e9d7f99d3ab656639a63 not found: ID does not exist" Nov 28 13:30:43 crc kubenswrapper[4970]: I1128 13:30:43.202958 4970 scope.go:117] "RemoveContainer" containerID="1f1a97d1705ed1e76b0cf0f089eb6c47193e9abd9fb034819db37ed7fa020ecf" Nov 28 13:30:43 crc kubenswrapper[4970]: I1128 13:30:43.203711 4970 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1f1a97d1705ed1e76b0cf0f089eb6c47193e9abd9fb034819db37ed7fa020ecf"} err="failed to get 
container status \"1f1a97d1705ed1e76b0cf0f089eb6c47193e9abd9fb034819db37ed7fa020ecf\": rpc error: code = NotFound desc = could not find container \"1f1a97d1705ed1e76b0cf0f089eb6c47193e9abd9fb034819db37ed7fa020ecf\": container with ID starting with 1f1a97d1705ed1e76b0cf0f089eb6c47193e9abd9fb034819db37ed7fa020ecf not found: ID does not exist" Nov 28 13:30:43 crc kubenswrapper[4970]: I1128 13:30:43.203767 4970 scope.go:117] "RemoveContainer" containerID="cdd0fc0e7dc5e274448e9288c6b82d7b14c900064fbf885342824db7ffa53248" Nov 28 13:30:43 crc kubenswrapper[4970]: I1128 13:30:43.205345 4970 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cdd0fc0e7dc5e274448e9288c6b82d7b14c900064fbf885342824db7ffa53248"} err="failed to get container status \"cdd0fc0e7dc5e274448e9288c6b82d7b14c900064fbf885342824db7ffa53248\": rpc error: code = NotFound desc = could not find container \"cdd0fc0e7dc5e274448e9288c6b82d7b14c900064fbf885342824db7ffa53248\": container with ID starting with cdd0fc0e7dc5e274448e9288c6b82d7b14c900064fbf885342824db7ffa53248 not found: ID does not exist" Nov 28 13:30:43 crc kubenswrapper[4970]: I1128 13:30:43.205368 4970 scope.go:117] "RemoveContainer" containerID="50bbc6bccc812634f10fce491d51f0bfc11b450bdd9435d97b5723bc25815737" Nov 28 13:30:43 crc kubenswrapper[4970]: I1128 13:30:43.205770 4970 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"50bbc6bccc812634f10fce491d51f0bfc11b450bdd9435d97b5723bc25815737"} err="failed to get container status \"50bbc6bccc812634f10fce491d51f0bfc11b450bdd9435d97b5723bc25815737\": rpc error: code = NotFound desc = could not find container \"50bbc6bccc812634f10fce491d51f0bfc11b450bdd9435d97b5723bc25815737\": container with ID starting with 50bbc6bccc812634f10fce491d51f0bfc11b450bdd9435d97b5723bc25815737 not found: ID does not exist" Nov 28 13:30:43 crc kubenswrapper[4970]: I1128 13:30:43.205793 4970 scope.go:117] "RemoveContainer" containerID="189ed85c89ee8535a047fcd0640a468aa0ea5ce2f787ec1e5c83b10b2c4636f8" Nov 28 13:30:43 crc kubenswrapper[4970]: I1128 13:30:43.206681 4970 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"189ed85c89ee8535a047fcd0640a468aa0ea5ce2f787ec1e5c83b10b2c4636f8"} err="failed to get container status \"189ed85c89ee8535a047fcd0640a468aa0ea5ce2f787ec1e5c83b10b2c4636f8\": rpc error: code = NotFound desc = could not find container \"189ed85c89ee8535a047fcd0640a468aa0ea5ce2f787ec1e5c83b10b2c4636f8\": container with ID starting with 189ed85c89ee8535a047fcd0640a468aa0ea5ce2f787ec1e5c83b10b2c4636f8 not found: ID does not exist" Nov 28 13:30:43 crc kubenswrapper[4970]: I1128 13:30:43.206701 4970 scope.go:117] "RemoveContainer" containerID="12d0179126488b68ea59ae5ac2c1144408426c93d8ee99a546fe23c0e78bc29d" Nov 28 13:30:43 crc kubenswrapper[4970]: I1128 13:30:43.207267 4970 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"12d0179126488b68ea59ae5ac2c1144408426c93d8ee99a546fe23c0e78bc29d"} err="failed to get container status \"12d0179126488b68ea59ae5ac2c1144408426c93d8ee99a546fe23c0e78bc29d\": rpc error: code = NotFound desc = could not find container \"12d0179126488b68ea59ae5ac2c1144408426c93d8ee99a546fe23c0e78bc29d\": container with ID starting with 12d0179126488b68ea59ae5ac2c1144408426c93d8ee99a546fe23c0e78bc29d not found: ID does not exist" Nov 28 13:30:43 crc kubenswrapper[4970]: I1128 13:30:43.207288 4970 scope.go:117] "RemoveContainer" 
containerID="a7d93a763c00244c10708497ff36b83939050bf928342bd4ead223a08fda036d" Nov 28 13:30:43 crc kubenswrapper[4970]: I1128 13:30:43.208638 4970 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a7d93a763c00244c10708497ff36b83939050bf928342bd4ead223a08fda036d"} err="failed to get container status \"a7d93a763c00244c10708497ff36b83939050bf928342bd4ead223a08fda036d\": rpc error: code = NotFound desc = could not find container \"a7d93a763c00244c10708497ff36b83939050bf928342bd4ead223a08fda036d\": container with ID starting with a7d93a763c00244c10708497ff36b83939050bf928342bd4ead223a08fda036d not found: ID does not exist" Nov 28 13:30:43 crc kubenswrapper[4970]: I1128 13:30:43.208740 4970 scope.go:117] "RemoveContainer" containerID="3dbb30bfa38ca50eb6b55645d501b246d1f9a16d6d4c6b40402ba6aac34f8e50" Nov 28 13:30:43 crc kubenswrapper[4970]: I1128 13:30:43.209195 4970 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3dbb30bfa38ca50eb6b55645d501b246d1f9a16d6d4c6b40402ba6aac34f8e50"} err="failed to get container status \"3dbb30bfa38ca50eb6b55645d501b246d1f9a16d6d4c6b40402ba6aac34f8e50\": rpc error: code = NotFound desc = could not find container \"3dbb30bfa38ca50eb6b55645d501b246d1f9a16d6d4c6b40402ba6aac34f8e50\": container with ID starting with 3dbb30bfa38ca50eb6b55645d501b246d1f9a16d6d4c6b40402ba6aac34f8e50 not found: ID does not exist" Nov 28 13:30:43 crc kubenswrapper[4970]: I1128 13:30:43.209251 4970 scope.go:117] "RemoveContainer" containerID="f3080d04f3ff55e9ea3e5226bc08fed0e4937b1ebbab93b76c039066071aba2d" Nov 28 13:30:43 crc kubenswrapper[4970]: I1128 13:30:43.210043 4970 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f3080d04f3ff55e9ea3e5226bc08fed0e4937b1ebbab93b76c039066071aba2d"} err="failed to get container status \"f3080d04f3ff55e9ea3e5226bc08fed0e4937b1ebbab93b76c039066071aba2d\": rpc error: code = NotFound desc = could not find container \"f3080d04f3ff55e9ea3e5226bc08fed0e4937b1ebbab93b76c039066071aba2d\": container with ID starting with f3080d04f3ff55e9ea3e5226bc08fed0e4937b1ebbab93b76c039066071aba2d not found: ID does not exist" Nov 28 13:30:43 crc kubenswrapper[4970]: I1128 13:30:43.210078 4970 scope.go:117] "RemoveContainer" containerID="af322842df0e9f9249a0d4c5342059686d51e57a9259e9d7f99d3ab656639a63" Nov 28 13:30:43 crc kubenswrapper[4970]: I1128 13:30:43.210443 4970 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"af322842df0e9f9249a0d4c5342059686d51e57a9259e9d7f99d3ab656639a63"} err="failed to get container status \"af322842df0e9f9249a0d4c5342059686d51e57a9259e9d7f99d3ab656639a63\": rpc error: code = NotFound desc = could not find container \"af322842df0e9f9249a0d4c5342059686d51e57a9259e9d7f99d3ab656639a63\": container with ID starting with af322842df0e9f9249a0d4c5342059686d51e57a9259e9d7f99d3ab656639a63 not found: ID does not exist" Nov 28 13:30:43 crc kubenswrapper[4970]: I1128 13:30:43.210477 4970 scope.go:117] "RemoveContainer" containerID="1f1a97d1705ed1e76b0cf0f089eb6c47193e9abd9fb034819db37ed7fa020ecf" Nov 28 13:30:43 crc kubenswrapper[4970]: I1128 13:30:43.210813 4970 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1f1a97d1705ed1e76b0cf0f089eb6c47193e9abd9fb034819db37ed7fa020ecf"} err="failed to get container status \"1f1a97d1705ed1e76b0cf0f089eb6c47193e9abd9fb034819db37ed7fa020ecf\": rpc error: code = NotFound desc = could not find 
container \"1f1a97d1705ed1e76b0cf0f089eb6c47193e9abd9fb034819db37ed7fa020ecf\": container with ID starting with 1f1a97d1705ed1e76b0cf0f089eb6c47193e9abd9fb034819db37ed7fa020ecf not found: ID does not exist" Nov 28 13:30:43 crc kubenswrapper[4970]: I1128 13:30:43.210843 4970 scope.go:117] "RemoveContainer" containerID="cdd0fc0e7dc5e274448e9288c6b82d7b14c900064fbf885342824db7ffa53248" Nov 28 13:30:43 crc kubenswrapper[4970]: I1128 13:30:43.211128 4970 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cdd0fc0e7dc5e274448e9288c6b82d7b14c900064fbf885342824db7ffa53248"} err="failed to get container status \"cdd0fc0e7dc5e274448e9288c6b82d7b14c900064fbf885342824db7ffa53248\": rpc error: code = NotFound desc = could not find container \"cdd0fc0e7dc5e274448e9288c6b82d7b14c900064fbf885342824db7ffa53248\": container with ID starting with cdd0fc0e7dc5e274448e9288c6b82d7b14c900064fbf885342824db7ffa53248 not found: ID does not exist" Nov 28 13:30:43 crc kubenswrapper[4970]: I1128 13:30:43.211155 4970 scope.go:117] "RemoveContainer" containerID="50bbc6bccc812634f10fce491d51f0bfc11b450bdd9435d97b5723bc25815737" Nov 28 13:30:43 crc kubenswrapper[4970]: I1128 13:30:43.211432 4970 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"50bbc6bccc812634f10fce491d51f0bfc11b450bdd9435d97b5723bc25815737"} err="failed to get container status \"50bbc6bccc812634f10fce491d51f0bfc11b450bdd9435d97b5723bc25815737\": rpc error: code = NotFound desc = could not find container \"50bbc6bccc812634f10fce491d51f0bfc11b450bdd9435d97b5723bc25815737\": container with ID starting with 50bbc6bccc812634f10fce491d51f0bfc11b450bdd9435d97b5723bc25815737 not found: ID does not exist" Nov 28 13:30:43 crc kubenswrapper[4970]: I1128 13:30:43.211460 4970 scope.go:117] "RemoveContainer" containerID="189ed85c89ee8535a047fcd0640a468aa0ea5ce2f787ec1e5c83b10b2c4636f8" Nov 28 13:30:43 crc kubenswrapper[4970]: I1128 13:30:43.211719 4970 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"189ed85c89ee8535a047fcd0640a468aa0ea5ce2f787ec1e5c83b10b2c4636f8"} err="failed to get container status \"189ed85c89ee8535a047fcd0640a468aa0ea5ce2f787ec1e5c83b10b2c4636f8\": rpc error: code = NotFound desc = could not find container \"189ed85c89ee8535a047fcd0640a468aa0ea5ce2f787ec1e5c83b10b2c4636f8\": container with ID starting with 189ed85c89ee8535a047fcd0640a468aa0ea5ce2f787ec1e5c83b10b2c4636f8 not found: ID does not exist" Nov 28 13:30:43 crc kubenswrapper[4970]: I1128 13:30:43.211758 4970 scope.go:117] "RemoveContainer" containerID="12d0179126488b68ea59ae5ac2c1144408426c93d8ee99a546fe23c0e78bc29d" Nov 28 13:30:43 crc kubenswrapper[4970]: I1128 13:30:43.212091 4970 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"12d0179126488b68ea59ae5ac2c1144408426c93d8ee99a546fe23c0e78bc29d"} err="failed to get container status \"12d0179126488b68ea59ae5ac2c1144408426c93d8ee99a546fe23c0e78bc29d\": rpc error: code = NotFound desc = could not find container \"12d0179126488b68ea59ae5ac2c1144408426c93d8ee99a546fe23c0e78bc29d\": container with ID starting with 12d0179126488b68ea59ae5ac2c1144408426c93d8ee99a546fe23c0e78bc29d not found: ID does not exist" Nov 28 13:30:43 crc kubenswrapper[4970]: I1128 13:30:43.212117 4970 scope.go:117] "RemoveContainer" containerID="a7d93a763c00244c10708497ff36b83939050bf928342bd4ead223a08fda036d" Nov 28 13:30:43 crc kubenswrapper[4970]: I1128 13:30:43.213380 4970 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a7d93a763c00244c10708497ff36b83939050bf928342bd4ead223a08fda036d"} err="failed to get container status \"a7d93a763c00244c10708497ff36b83939050bf928342bd4ead223a08fda036d\": rpc error: code = NotFound desc = could not find container \"a7d93a763c00244c10708497ff36b83939050bf928342bd4ead223a08fda036d\": container with ID starting with a7d93a763c00244c10708497ff36b83939050bf928342bd4ead223a08fda036d not found: ID does not exist" Nov 28 13:30:43 crc kubenswrapper[4970]: I1128 13:30:43.328315 4970 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-6c6s9"] Nov 28 13:30:43 crc kubenswrapper[4970]: I1128 13:30:43.333822 4970 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-6c6s9"] Nov 28 13:30:43 crc kubenswrapper[4970]: I1128 13:30:43.392987 4970 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="17474edc-f114-4ee6-b6bb-95b55f1731ac" path="/var/lib/kubelet/pods/17474edc-f114-4ee6-b6bb-95b55f1731ac/volumes" Nov 28 13:30:44 crc kubenswrapper[4970]: I1128 13:30:44.595787 4970 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-krtxh_ddf11f1e-5631-4329-9db4-b75fed094c5f/kube-multus/0.log" Nov 28 13:30:44 crc kubenswrapper[4970]: I1128 13:30:44.595872 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-krtxh" event={"ID":"ddf11f1e-5631-4329-9db4-b75fed094c5f","Type":"ContainerStarted","Data":"db06ae7c7d873afa409c8e46598baebaa8e3e966ea3c1d4671e90f9b93eee5d2"} Nov 28 13:30:44 crc kubenswrapper[4970]: I1128 13:30:44.601867 4970 generic.go:334] "Generic (PLEG): container finished" podID="f28c398a-ea7e-4117-b7f9-fa78579062e1" containerID="bec751a8d46de757bc255f7748db964197bea355bdab8fa3a7cb94f62ff01bb9" exitCode=0 Nov 28 13:30:44 crc kubenswrapper[4970]: I1128 13:30:44.601917 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9mrsq" event={"ID":"f28c398a-ea7e-4117-b7f9-fa78579062e1","Type":"ContainerDied","Data":"bec751a8d46de757bc255f7748db964197bea355bdab8fa3a7cb94f62ff01bb9"} Nov 28 13:30:44 crc kubenswrapper[4970]: I1128 13:30:44.601969 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9mrsq" event={"ID":"f28c398a-ea7e-4117-b7f9-fa78579062e1","Type":"ContainerStarted","Data":"f3d523618b76ca91637e90c8dba72de62dc4228a9e161531f8ff7e236c68a7fe"} Nov 28 13:30:45 crc kubenswrapper[4970]: I1128 13:30:45.620479 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9mrsq" event={"ID":"f28c398a-ea7e-4117-b7f9-fa78579062e1","Type":"ContainerStarted","Data":"94405cbe11d1240d5bd940b7e9b99423ae38f81f50854bd25ed2819901314c37"} Nov 28 13:30:45 crc kubenswrapper[4970]: I1128 13:30:45.620982 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9mrsq" event={"ID":"f28c398a-ea7e-4117-b7f9-fa78579062e1","Type":"ContainerStarted","Data":"7405a6a5a334afcb4936605a6d1904d280a484d02167d0c36e55fdaa29cdc0be"} Nov 28 13:30:45 crc kubenswrapper[4970]: I1128 13:30:45.620997 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9mrsq" event={"ID":"f28c398a-ea7e-4117-b7f9-fa78579062e1","Type":"ContainerStarted","Data":"5ab1f2087cad445eb3098e715aea1c1153f809979542f42f52edeb6cfb7d2461"} Nov 28 13:30:45 crc kubenswrapper[4970]: I1128 13:30:45.621010 4970 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9mrsq" event={"ID":"f28c398a-ea7e-4117-b7f9-fa78579062e1","Type":"ContainerStarted","Data":"3c3f2f97fd81c39a5458ccf53813b1a9455f3e83e6a848dfd999a78fcea39fde"} Nov 28 13:30:45 crc kubenswrapper[4970]: I1128 13:30:45.621021 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9mrsq" event={"ID":"f28c398a-ea7e-4117-b7f9-fa78579062e1","Type":"ContainerStarted","Data":"e920f5a29acac5dc204a594b5183a4eb6b764772690733568d70f028c4b21611"} Nov 28 13:30:45 crc kubenswrapper[4970]: I1128 13:30:45.621032 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9mrsq" event={"ID":"f28c398a-ea7e-4117-b7f9-fa78579062e1","Type":"ContainerStarted","Data":"1bdc0d7fc7653802676a54bb2ef6ac0faa2f68645c5a2c462fb7c159d3c08d7c"} Nov 28 13:30:48 crc kubenswrapper[4970]: I1128 13:30:48.644423 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9mrsq" event={"ID":"f28c398a-ea7e-4117-b7f9-fa78579062e1","Type":"ContainerStarted","Data":"d63f709f0215033393d0d01f867588394d1f08e78abb820d8a96297eb9aef541"} Nov 28 13:30:52 crc kubenswrapper[4970]: I1128 13:30:52.674330 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9mrsq" event={"ID":"f28c398a-ea7e-4117-b7f9-fa78579062e1","Type":"ContainerStarted","Data":"6c42724f3f04de0879546099007a777fafe9101855a98c39c771636e47d12f48"} Nov 28 13:30:52 crc kubenswrapper[4970]: I1128 13:30:52.675340 4970 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-9mrsq" Nov 28 13:30:52 crc kubenswrapper[4970]: I1128 13:30:52.675379 4970 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-9mrsq" Nov 28 13:30:52 crc kubenswrapper[4970]: I1128 13:30:52.675404 4970 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-9mrsq" Nov 28 13:30:52 crc kubenswrapper[4970]: I1128 13:30:52.713702 4970 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-9mrsq" Nov 28 13:30:52 crc kubenswrapper[4970]: I1128 13:30:52.715627 4970 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-9mrsq" podStartSLOduration=10.715608221 podStartE2EDuration="10.715608221s" podCreationTimestamp="2025-11-28 13:30:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-28 13:30:52.71416455 +0000 UTC m=+663.567046360" watchObservedRunningTime="2025-11-28 13:30:52.715608221 +0000 UTC m=+663.568490041" Nov 28 13:30:52 crc kubenswrapper[4970]: I1128 13:30:52.730105 4970 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-9mrsq" Nov 28 13:31:11 crc kubenswrapper[4970]: I1128 13:31:11.539913 4970 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/mariadb-operator-index-gr7dz"] Nov 28 13:31:11 crc kubenswrapper[4970]: I1128 13:31:11.541315 4970 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/mariadb-operator-index-gr7dz" Nov 28 13:31:11 crc kubenswrapper[4970]: I1128 13:31:11.544356 4970 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"mariadb-operator-index-dockercfg-w8nw4" Nov 28 13:31:11 crc kubenswrapper[4970]: I1128 13:31:11.544559 4970 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"openshift-service-ca.crt" Nov 28 13:31:11 crc kubenswrapper[4970]: I1128 13:31:11.544974 4970 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"kube-root-ca.crt" Nov 28 13:31:11 crc kubenswrapper[4970]: I1128 13:31:11.551939 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-87qd2\" (UniqueName: \"kubernetes.io/projected/c6a26f53-7c34-4833-9da7-d3bc979dfef1-kube-api-access-87qd2\") pod \"mariadb-operator-index-gr7dz\" (UID: \"c6a26f53-7c34-4833-9da7-d3bc979dfef1\") " pod="openstack-operators/mariadb-operator-index-gr7dz" Nov 28 13:31:11 crc kubenswrapper[4970]: I1128 13:31:11.564462 4970 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-index-gr7dz"] Nov 28 13:31:11 crc kubenswrapper[4970]: I1128 13:31:11.653155 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-87qd2\" (UniqueName: \"kubernetes.io/projected/c6a26f53-7c34-4833-9da7-d3bc979dfef1-kube-api-access-87qd2\") pod \"mariadb-operator-index-gr7dz\" (UID: \"c6a26f53-7c34-4833-9da7-d3bc979dfef1\") " pod="openstack-operators/mariadb-operator-index-gr7dz" Nov 28 13:31:11 crc kubenswrapper[4970]: I1128 13:31:11.674392 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-87qd2\" (UniqueName: \"kubernetes.io/projected/c6a26f53-7c34-4833-9da7-d3bc979dfef1-kube-api-access-87qd2\") pod \"mariadb-operator-index-gr7dz\" (UID: \"c6a26f53-7c34-4833-9da7-d3bc979dfef1\") " pod="openstack-operators/mariadb-operator-index-gr7dz" Nov 28 13:31:11 crc kubenswrapper[4970]: I1128 13:31:11.865898 4970 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/mariadb-operator-index-gr7dz" Nov 28 13:31:12 crc kubenswrapper[4970]: I1128 13:31:12.084902 4970 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-index-gr7dz"] Nov 28 13:31:12 crc kubenswrapper[4970]: I1128 13:31:12.093920 4970 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Nov 28 13:31:12 crc kubenswrapper[4970]: I1128 13:31:12.799515 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-index-gr7dz" event={"ID":"c6a26f53-7c34-4833-9da7-d3bc979dfef1","Type":"ContainerStarted","Data":"70ce106baabb5b434fdf07c0bd35b59d5b46acdd8dbd703a27c35ba0a4cd8151"} Nov 28 13:31:13 crc kubenswrapper[4970]: I1128 13:31:13.223304 4970 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-9mrsq" Nov 28 13:31:14 crc kubenswrapper[4970]: I1128 13:31:14.508319 4970 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/mariadb-operator-index-gr7dz"] Nov 28 13:31:15 crc kubenswrapper[4970]: I1128 13:31:15.120761 4970 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/mariadb-operator-index-w2zqv"] Nov 28 13:31:15 crc kubenswrapper[4970]: I1128 13:31:15.122796 4970 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-index-w2zqv" Nov 28 13:31:15 crc kubenswrapper[4970]: I1128 13:31:15.127407 4970 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-index-w2zqv"] Nov 28 13:31:15 crc kubenswrapper[4970]: I1128 13:31:15.206944 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vkgvh\" (UniqueName: \"kubernetes.io/projected/9d1382f3-12a1-4d6a-84ce-8ca5fb915aa3-kube-api-access-vkgvh\") pod \"mariadb-operator-index-w2zqv\" (UID: \"9d1382f3-12a1-4d6a-84ce-8ca5fb915aa3\") " pod="openstack-operators/mariadb-operator-index-w2zqv" Nov 28 13:31:15 crc kubenswrapper[4970]: I1128 13:31:15.308363 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vkgvh\" (UniqueName: \"kubernetes.io/projected/9d1382f3-12a1-4d6a-84ce-8ca5fb915aa3-kube-api-access-vkgvh\") pod \"mariadb-operator-index-w2zqv\" (UID: \"9d1382f3-12a1-4d6a-84ce-8ca5fb915aa3\") " pod="openstack-operators/mariadb-operator-index-w2zqv" Nov 28 13:31:15 crc kubenswrapper[4970]: I1128 13:31:15.330334 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vkgvh\" (UniqueName: \"kubernetes.io/projected/9d1382f3-12a1-4d6a-84ce-8ca5fb915aa3-kube-api-access-vkgvh\") pod \"mariadb-operator-index-w2zqv\" (UID: \"9d1382f3-12a1-4d6a-84ce-8ca5fb915aa3\") " pod="openstack-operators/mariadb-operator-index-w2zqv" Nov 28 13:31:15 crc kubenswrapper[4970]: I1128 13:31:15.443157 4970 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/mariadb-operator-index-w2zqv" Nov 28 13:31:19 crc kubenswrapper[4970]: I1128 13:31:19.396604 4970 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-index-w2zqv"] Nov 28 13:31:19 crc kubenswrapper[4970]: I1128 13:31:19.842141 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-index-gr7dz" event={"ID":"c6a26f53-7c34-4833-9da7-d3bc979dfef1","Type":"ContainerStarted","Data":"2e37d29c16a78ca72518ff670b8e95bf6ba8a49b280c2f92e1f6f99abffc1387"} Nov 28 13:31:19 crc kubenswrapper[4970]: I1128 13:31:19.842357 4970 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/mariadb-operator-index-gr7dz" podUID="c6a26f53-7c34-4833-9da7-d3bc979dfef1" containerName="registry-server" containerID="cri-o://2e37d29c16a78ca72518ff670b8e95bf6ba8a49b280c2f92e1f6f99abffc1387" gracePeriod=2 Nov 28 13:31:19 crc kubenswrapper[4970]: I1128 13:31:19.843853 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-index-w2zqv" event={"ID":"9d1382f3-12a1-4d6a-84ce-8ca5fb915aa3","Type":"ContainerStarted","Data":"b8279e1ad2c372acf6a50bd69758dc106a33f8c7c049b720ed4f3d98ff529639"} Nov 28 13:31:19 crc kubenswrapper[4970]: I1128 13:31:19.864371 4970 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/mariadb-operator-index-gr7dz" podStartSLOduration=1.628541486 podStartE2EDuration="8.864348004s" podCreationTimestamp="2025-11-28 13:31:11 +0000 UTC" firstStartedPulling="2025-11-28 13:31:12.09366306 +0000 UTC m=+682.946544860" lastFinishedPulling="2025-11-28 13:31:19.329469568 +0000 UTC m=+690.182351378" observedRunningTime="2025-11-28 13:31:19.862857492 +0000 UTC m=+690.715739292" watchObservedRunningTime="2025-11-28 13:31:19.864348004 +0000 UTC m=+690.717229804" Nov 28 13:31:20 crc kubenswrapper[4970]: I1128 13:31:20.257678 4970 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-index-gr7dz" Nov 28 13:31:20 crc kubenswrapper[4970]: I1128 13:31:20.286330 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-87qd2\" (UniqueName: \"kubernetes.io/projected/c6a26f53-7c34-4833-9da7-d3bc979dfef1-kube-api-access-87qd2\") pod \"c6a26f53-7c34-4833-9da7-d3bc979dfef1\" (UID: \"c6a26f53-7c34-4833-9da7-d3bc979dfef1\") " Nov 28 13:31:20 crc kubenswrapper[4970]: I1128 13:31:20.298390 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c6a26f53-7c34-4833-9da7-d3bc979dfef1-kube-api-access-87qd2" (OuterVolumeSpecName: "kube-api-access-87qd2") pod "c6a26f53-7c34-4833-9da7-d3bc979dfef1" (UID: "c6a26f53-7c34-4833-9da7-d3bc979dfef1"). InnerVolumeSpecName "kube-api-access-87qd2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 28 13:31:20 crc kubenswrapper[4970]: I1128 13:31:20.387595 4970 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-87qd2\" (UniqueName: \"kubernetes.io/projected/c6a26f53-7c34-4833-9da7-d3bc979dfef1-kube-api-access-87qd2\") on node \"crc\" DevicePath \"\"" Nov 28 13:31:20 crc kubenswrapper[4970]: I1128 13:31:20.856614 4970 generic.go:334] "Generic (PLEG): container finished" podID="c6a26f53-7c34-4833-9da7-d3bc979dfef1" containerID="2e37d29c16a78ca72518ff670b8e95bf6ba8a49b280c2f92e1f6f99abffc1387" exitCode=0 Nov 28 13:31:20 crc kubenswrapper[4970]: I1128 13:31:20.856729 4970 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-index-gr7dz" Nov 28 13:31:20 crc kubenswrapper[4970]: I1128 13:31:20.856761 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-index-gr7dz" event={"ID":"c6a26f53-7c34-4833-9da7-d3bc979dfef1","Type":"ContainerDied","Data":"2e37d29c16a78ca72518ff670b8e95bf6ba8a49b280c2f92e1f6f99abffc1387"} Nov 28 13:31:20 crc kubenswrapper[4970]: I1128 13:31:20.856829 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-index-gr7dz" event={"ID":"c6a26f53-7c34-4833-9da7-d3bc979dfef1","Type":"ContainerDied","Data":"70ce106baabb5b434fdf07c0bd35b59d5b46acdd8dbd703a27c35ba0a4cd8151"} Nov 28 13:31:20 crc kubenswrapper[4970]: I1128 13:31:20.856918 4970 scope.go:117] "RemoveContainer" containerID="2e37d29c16a78ca72518ff670b8e95bf6ba8a49b280c2f92e1f6f99abffc1387" Nov 28 13:31:20 crc kubenswrapper[4970]: I1128 13:31:20.861970 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-index-w2zqv" event={"ID":"9d1382f3-12a1-4d6a-84ce-8ca5fb915aa3","Type":"ContainerStarted","Data":"6d6e8d2c141c112a6b3b522c25558a73900d98b7f358a55f70b227ac09cc4b29"} Nov 28 13:31:20 crc kubenswrapper[4970]: I1128 13:31:20.891308 4970 scope.go:117] "RemoveContainer" containerID="2e37d29c16a78ca72518ff670b8e95bf6ba8a49b280c2f92e1f6f99abffc1387" Nov 28 13:31:20 crc kubenswrapper[4970]: I1128 13:31:20.891928 4970 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/mariadb-operator-index-w2zqv" podStartSLOduration=5.208433073 podStartE2EDuration="5.891893709s" podCreationTimestamp="2025-11-28 13:31:15 +0000 UTC" firstStartedPulling="2025-11-28 13:31:19.427170586 +0000 UTC m=+690.280052386" lastFinishedPulling="2025-11-28 13:31:20.110631222 +0000 UTC m=+690.963513022" observedRunningTime="2025-11-28 13:31:20.887466843 +0000 UTC m=+691.740348683" watchObservedRunningTime="2025-11-28 13:31:20.891893709 +0000 UTC m=+691.744775539" Nov 28 13:31:20 crc kubenswrapper[4970]: E1128 13:31:20.892361 4970 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2e37d29c16a78ca72518ff670b8e95bf6ba8a49b280c2f92e1f6f99abffc1387\": container with ID starting with 2e37d29c16a78ca72518ff670b8e95bf6ba8a49b280c2f92e1f6f99abffc1387 not found: ID does not exist" containerID="2e37d29c16a78ca72518ff670b8e95bf6ba8a49b280c2f92e1f6f99abffc1387" Nov 28 13:31:20 crc kubenswrapper[4970]: I1128 13:31:20.892446 4970 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2e37d29c16a78ca72518ff670b8e95bf6ba8a49b280c2f92e1f6f99abffc1387"} err="failed to get container status \"2e37d29c16a78ca72518ff670b8e95bf6ba8a49b280c2f92e1f6f99abffc1387\": 
rpc error: code = NotFound desc = could not find container \"2e37d29c16a78ca72518ff670b8e95bf6ba8a49b280c2f92e1f6f99abffc1387\": container with ID starting with 2e37d29c16a78ca72518ff670b8e95bf6ba8a49b280c2f92e1f6f99abffc1387 not found: ID does not exist" Nov 28 13:31:20 crc kubenswrapper[4970]: I1128 13:31:20.919622 4970 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/mariadb-operator-index-gr7dz"] Nov 28 13:31:20 crc kubenswrapper[4970]: I1128 13:31:20.924920 4970 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/mariadb-operator-index-gr7dz"] Nov 28 13:31:21 crc kubenswrapper[4970]: I1128 13:31:21.391240 4970 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c6a26f53-7c34-4833-9da7-d3bc979dfef1" path="/var/lib/kubelet/pods/c6a26f53-7c34-4833-9da7-d3bc979dfef1/volumes" Nov 28 13:31:25 crc kubenswrapper[4970]: I1128 13:31:25.443882 4970 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/mariadb-operator-index-w2zqv" Nov 28 13:31:25 crc kubenswrapper[4970]: I1128 13:31:25.443949 4970 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/mariadb-operator-index-w2zqv" Nov 28 13:31:25 crc kubenswrapper[4970]: I1128 13:31:25.478120 4970 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/mariadb-operator-index-w2zqv" Nov 28 13:31:25 crc kubenswrapper[4970]: I1128 13:31:25.928349 4970 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/mariadb-operator-index-w2zqv" Nov 28 13:31:27 crc kubenswrapper[4970]: I1128 13:31:27.208924 4970 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/27e8bc079695f3aed52a6c5be68196d91a6230a1a03a8fc87a19aa534f92sbs"] Nov 28 13:31:27 crc kubenswrapper[4970]: E1128 13:31:27.209141 4970 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c6a26f53-7c34-4833-9da7-d3bc979dfef1" containerName="registry-server" Nov 28 13:31:27 crc kubenswrapper[4970]: I1128 13:31:27.209181 4970 state_mem.go:107] "Deleted CPUSet assignment" podUID="c6a26f53-7c34-4833-9da7-d3bc979dfef1" containerName="registry-server" Nov 28 13:31:27 crc kubenswrapper[4970]: I1128 13:31:27.209318 4970 memory_manager.go:354] "RemoveStaleState removing state" podUID="c6a26f53-7c34-4833-9da7-d3bc979dfef1" containerName="registry-server" Nov 28 13:31:27 crc kubenswrapper[4970]: I1128 13:31:27.209993 4970 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/27e8bc079695f3aed52a6c5be68196d91a6230a1a03a8fc87a19aa534f92sbs" Nov 28 13:31:27 crc kubenswrapper[4970]: I1128 13:31:27.216202 4970 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-77hkb" Nov 28 13:31:27 crc kubenswrapper[4970]: I1128 13:31:27.229455 4970 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/27e8bc079695f3aed52a6c5be68196d91a6230a1a03a8fc87a19aa534f92sbs"] Nov 28 13:31:27 crc kubenswrapper[4970]: I1128 13:31:27.285405 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f9534406-ffb1-48fd-8589-8f5d4bac63a4-bundle\") pod \"27e8bc079695f3aed52a6c5be68196d91a6230a1a03a8fc87a19aa534f92sbs\" (UID: \"f9534406-ffb1-48fd-8589-8f5d4bac63a4\") " pod="openstack-operators/27e8bc079695f3aed52a6c5be68196d91a6230a1a03a8fc87a19aa534f92sbs" Nov 28 13:31:27 crc kubenswrapper[4970]: I1128 13:31:27.285507 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f9534406-ffb1-48fd-8589-8f5d4bac63a4-util\") pod \"27e8bc079695f3aed52a6c5be68196d91a6230a1a03a8fc87a19aa534f92sbs\" (UID: \"f9534406-ffb1-48fd-8589-8f5d4bac63a4\") " pod="openstack-operators/27e8bc079695f3aed52a6c5be68196d91a6230a1a03a8fc87a19aa534f92sbs" Nov 28 13:31:27 crc kubenswrapper[4970]: I1128 13:31:27.285539 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ps2rc\" (UniqueName: \"kubernetes.io/projected/f9534406-ffb1-48fd-8589-8f5d4bac63a4-kube-api-access-ps2rc\") pod \"27e8bc079695f3aed52a6c5be68196d91a6230a1a03a8fc87a19aa534f92sbs\" (UID: \"f9534406-ffb1-48fd-8589-8f5d4bac63a4\") " pod="openstack-operators/27e8bc079695f3aed52a6c5be68196d91a6230a1a03a8fc87a19aa534f92sbs" Nov 28 13:31:27 crc kubenswrapper[4970]: I1128 13:31:27.386650 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f9534406-ffb1-48fd-8589-8f5d4bac63a4-util\") pod \"27e8bc079695f3aed52a6c5be68196d91a6230a1a03a8fc87a19aa534f92sbs\" (UID: \"f9534406-ffb1-48fd-8589-8f5d4bac63a4\") " pod="openstack-operators/27e8bc079695f3aed52a6c5be68196d91a6230a1a03a8fc87a19aa534f92sbs" Nov 28 13:31:27 crc kubenswrapper[4970]: I1128 13:31:27.386717 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ps2rc\" (UniqueName: \"kubernetes.io/projected/f9534406-ffb1-48fd-8589-8f5d4bac63a4-kube-api-access-ps2rc\") pod \"27e8bc079695f3aed52a6c5be68196d91a6230a1a03a8fc87a19aa534f92sbs\" (UID: \"f9534406-ffb1-48fd-8589-8f5d4bac63a4\") " pod="openstack-operators/27e8bc079695f3aed52a6c5be68196d91a6230a1a03a8fc87a19aa534f92sbs" Nov 28 13:31:27 crc kubenswrapper[4970]: I1128 13:31:27.386814 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f9534406-ffb1-48fd-8589-8f5d4bac63a4-bundle\") pod \"27e8bc079695f3aed52a6c5be68196d91a6230a1a03a8fc87a19aa534f92sbs\" (UID: \"f9534406-ffb1-48fd-8589-8f5d4bac63a4\") " pod="openstack-operators/27e8bc079695f3aed52a6c5be68196d91a6230a1a03a8fc87a19aa534f92sbs" Nov 28 13:31:27 crc kubenswrapper[4970]: I1128 13:31:27.387556 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/f9534406-ffb1-48fd-8589-8f5d4bac63a4-bundle\") pod \"27e8bc079695f3aed52a6c5be68196d91a6230a1a03a8fc87a19aa534f92sbs\" (UID: \"f9534406-ffb1-48fd-8589-8f5d4bac63a4\") " pod="openstack-operators/27e8bc079695f3aed52a6c5be68196d91a6230a1a03a8fc87a19aa534f92sbs" Nov 28 13:31:27 crc kubenswrapper[4970]: I1128 13:31:27.387690 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f9534406-ffb1-48fd-8589-8f5d4bac63a4-util\") pod \"27e8bc079695f3aed52a6c5be68196d91a6230a1a03a8fc87a19aa534f92sbs\" (UID: \"f9534406-ffb1-48fd-8589-8f5d4bac63a4\") " pod="openstack-operators/27e8bc079695f3aed52a6c5be68196d91a6230a1a03a8fc87a19aa534f92sbs" Nov 28 13:31:27 crc kubenswrapper[4970]: I1128 13:31:27.430810 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ps2rc\" (UniqueName: \"kubernetes.io/projected/f9534406-ffb1-48fd-8589-8f5d4bac63a4-kube-api-access-ps2rc\") pod \"27e8bc079695f3aed52a6c5be68196d91a6230a1a03a8fc87a19aa534f92sbs\" (UID: \"f9534406-ffb1-48fd-8589-8f5d4bac63a4\") " pod="openstack-operators/27e8bc079695f3aed52a6c5be68196d91a6230a1a03a8fc87a19aa534f92sbs" Nov 28 13:31:27 crc kubenswrapper[4970]: I1128 13:31:27.525396 4970 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/27e8bc079695f3aed52a6c5be68196d91a6230a1a03a8fc87a19aa534f92sbs" Nov 28 13:31:27 crc kubenswrapper[4970]: I1128 13:31:27.759338 4970 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/27e8bc079695f3aed52a6c5be68196d91a6230a1a03a8fc87a19aa534f92sbs"] Nov 28 13:31:27 crc kubenswrapper[4970]: I1128 13:31:27.905985 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/27e8bc079695f3aed52a6c5be68196d91a6230a1a03a8fc87a19aa534f92sbs" event={"ID":"f9534406-ffb1-48fd-8589-8f5d4bac63a4","Type":"ContainerStarted","Data":"8351badd290a3a757b360b40f4950558a78f83236814b5f029e231bcb431412e"} Nov 28 13:31:30 crc kubenswrapper[4970]: I1128 13:31:30.926296 4970 generic.go:334] "Generic (PLEG): container finished" podID="f9534406-ffb1-48fd-8589-8f5d4bac63a4" containerID="ed6e63b47f11e2b0b740d49500df04701d0adff42d441e36143c3cec25e03beb" exitCode=0 Nov 28 13:31:30 crc kubenswrapper[4970]: I1128 13:31:30.926483 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/27e8bc079695f3aed52a6c5be68196d91a6230a1a03a8fc87a19aa534f92sbs" event={"ID":"f9534406-ffb1-48fd-8589-8f5d4bac63a4","Type":"ContainerDied","Data":"ed6e63b47f11e2b0b740d49500df04701d0adff42d441e36143c3cec25e03beb"} Nov 28 13:31:31 crc kubenswrapper[4970]: I1128 13:31:31.938126 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/27e8bc079695f3aed52a6c5be68196d91a6230a1a03a8fc87a19aa534f92sbs" event={"ID":"f9534406-ffb1-48fd-8589-8f5d4bac63a4","Type":"ContainerStarted","Data":"4a6d69741843b952d01701dd4386ce70c005c8b765423a5b9aaeb7abd0d72438"} Nov 28 13:31:32 crc kubenswrapper[4970]: I1128 13:31:32.951468 4970 generic.go:334] "Generic (PLEG): container finished" podID="f9534406-ffb1-48fd-8589-8f5d4bac63a4" containerID="4a6d69741843b952d01701dd4386ce70c005c8b765423a5b9aaeb7abd0d72438" exitCode=0 Nov 28 13:31:32 crc kubenswrapper[4970]: I1128 13:31:32.951595 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/27e8bc079695f3aed52a6c5be68196d91a6230a1a03a8fc87a19aa534f92sbs" 
event={"ID":"f9534406-ffb1-48fd-8589-8f5d4bac63a4","Type":"ContainerDied","Data":"4a6d69741843b952d01701dd4386ce70c005c8b765423a5b9aaeb7abd0d72438"} Nov 28 13:31:33 crc kubenswrapper[4970]: I1128 13:31:33.963193 4970 generic.go:334] "Generic (PLEG): container finished" podID="f9534406-ffb1-48fd-8589-8f5d4bac63a4" containerID="168e3dadacb99c825ae65b522ee81165a4066f0f04f3d1c844ddbe57e6ec113a" exitCode=0 Nov 28 13:31:33 crc kubenswrapper[4970]: I1128 13:31:33.963284 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/27e8bc079695f3aed52a6c5be68196d91a6230a1a03a8fc87a19aa534f92sbs" event={"ID":"f9534406-ffb1-48fd-8589-8f5d4bac63a4","Type":"ContainerDied","Data":"168e3dadacb99c825ae65b522ee81165a4066f0f04f3d1c844ddbe57e6ec113a"} Nov 28 13:31:35 crc kubenswrapper[4970]: I1128 13:31:35.239792 4970 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/27e8bc079695f3aed52a6c5be68196d91a6230a1a03a8fc87a19aa534f92sbs" Nov 28 13:31:35 crc kubenswrapper[4970]: I1128 13:31:35.292075 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ps2rc\" (UniqueName: \"kubernetes.io/projected/f9534406-ffb1-48fd-8589-8f5d4bac63a4-kube-api-access-ps2rc\") pod \"f9534406-ffb1-48fd-8589-8f5d4bac63a4\" (UID: \"f9534406-ffb1-48fd-8589-8f5d4bac63a4\") " Nov 28 13:31:35 crc kubenswrapper[4970]: I1128 13:31:35.292314 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f9534406-ffb1-48fd-8589-8f5d4bac63a4-bundle\") pod \"f9534406-ffb1-48fd-8589-8f5d4bac63a4\" (UID: \"f9534406-ffb1-48fd-8589-8f5d4bac63a4\") " Nov 28 13:31:35 crc kubenswrapper[4970]: I1128 13:31:35.292429 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f9534406-ffb1-48fd-8589-8f5d4bac63a4-util\") pod \"f9534406-ffb1-48fd-8589-8f5d4bac63a4\" (UID: \"f9534406-ffb1-48fd-8589-8f5d4bac63a4\") " Nov 28 13:31:35 crc kubenswrapper[4970]: I1128 13:31:35.293625 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f9534406-ffb1-48fd-8589-8f5d4bac63a4-bundle" (OuterVolumeSpecName: "bundle") pod "f9534406-ffb1-48fd-8589-8f5d4bac63a4" (UID: "f9534406-ffb1-48fd-8589-8f5d4bac63a4"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 28 13:31:35 crc kubenswrapper[4970]: I1128 13:31:35.302560 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f9534406-ffb1-48fd-8589-8f5d4bac63a4-kube-api-access-ps2rc" (OuterVolumeSpecName: "kube-api-access-ps2rc") pod "f9534406-ffb1-48fd-8589-8f5d4bac63a4" (UID: "f9534406-ffb1-48fd-8589-8f5d4bac63a4"). InnerVolumeSpecName "kube-api-access-ps2rc". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 28 13:31:35 crc kubenswrapper[4970]: I1128 13:31:35.323154 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f9534406-ffb1-48fd-8589-8f5d4bac63a4-util" (OuterVolumeSpecName: "util") pod "f9534406-ffb1-48fd-8589-8f5d4bac63a4" (UID: "f9534406-ffb1-48fd-8589-8f5d4bac63a4"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 28 13:31:35 crc kubenswrapper[4970]: I1128 13:31:35.393758 4970 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f9534406-ffb1-48fd-8589-8f5d4bac63a4-bundle\") on node \"crc\" DevicePath \"\"" Nov 28 13:31:35 crc kubenswrapper[4970]: I1128 13:31:35.393798 4970 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f9534406-ffb1-48fd-8589-8f5d4bac63a4-util\") on node \"crc\" DevicePath \"\"" Nov 28 13:31:35 crc kubenswrapper[4970]: I1128 13:31:35.393810 4970 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ps2rc\" (UniqueName: \"kubernetes.io/projected/f9534406-ffb1-48fd-8589-8f5d4bac63a4-kube-api-access-ps2rc\") on node \"crc\" DevicePath \"\"" Nov 28 13:31:35 crc kubenswrapper[4970]: I1128 13:31:35.980796 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/27e8bc079695f3aed52a6c5be68196d91a6230a1a03a8fc87a19aa534f92sbs" event={"ID":"f9534406-ffb1-48fd-8589-8f5d4bac63a4","Type":"ContainerDied","Data":"8351badd290a3a757b360b40f4950558a78f83236814b5f029e231bcb431412e"} Nov 28 13:31:35 crc kubenswrapper[4970]: I1128 13:31:35.981173 4970 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8351badd290a3a757b360b40f4950558a78f83236814b5f029e231bcb431412e" Nov 28 13:31:35 crc kubenswrapper[4970]: I1128 13:31:35.980891 4970 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/27e8bc079695f3aed52a6c5be68196d91a6230a1a03a8fc87a19aa534f92sbs" Nov 28 13:31:40 crc kubenswrapper[4970]: I1128 13:31:40.238209 4970 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-7bdb74f78d-ct66p"] Nov 28 13:31:40 crc kubenswrapper[4970]: E1128 13:31:40.238689 4970 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f9534406-ffb1-48fd-8589-8f5d4bac63a4" containerName="extract" Nov 28 13:31:40 crc kubenswrapper[4970]: I1128 13:31:40.238701 4970 state_mem.go:107] "Deleted CPUSet assignment" podUID="f9534406-ffb1-48fd-8589-8f5d4bac63a4" containerName="extract" Nov 28 13:31:40 crc kubenswrapper[4970]: E1128 13:31:40.238714 4970 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f9534406-ffb1-48fd-8589-8f5d4bac63a4" containerName="pull" Nov 28 13:31:40 crc kubenswrapper[4970]: I1128 13:31:40.238720 4970 state_mem.go:107] "Deleted CPUSet assignment" podUID="f9534406-ffb1-48fd-8589-8f5d4bac63a4" containerName="pull" Nov 28 13:31:40 crc kubenswrapper[4970]: E1128 13:31:40.238730 4970 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f9534406-ffb1-48fd-8589-8f5d4bac63a4" containerName="util" Nov 28 13:31:40 crc kubenswrapper[4970]: I1128 13:31:40.238737 4970 state_mem.go:107] "Deleted CPUSet assignment" podUID="f9534406-ffb1-48fd-8589-8f5d4bac63a4" containerName="util" Nov 28 13:31:40 crc kubenswrapper[4970]: I1128 13:31:40.238842 4970 memory_manager.go:354] "RemoveStaleState removing state" podUID="f9534406-ffb1-48fd-8589-8f5d4bac63a4" containerName="extract" Nov 28 13:31:40 crc kubenswrapper[4970]: I1128 13:31:40.239173 4970 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-7bdb74f78d-ct66p" Nov 28 13:31:40 crc kubenswrapper[4970]: I1128 13:31:40.241570 4970 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"mariadb-operator-controller-manager-dockercfg-kbb4p" Nov 28 13:31:40 crc kubenswrapper[4970]: I1128 13:31:40.242089 4970 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"mariadb-operator-controller-manager-service-cert" Nov 28 13:31:40 crc kubenswrapper[4970]: I1128 13:31:40.243503 4970 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"webhook-server-cert" Nov 28 13:31:40 crc kubenswrapper[4970]: I1128 13:31:40.294146 4970 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-7bdb74f78d-ct66p"] Nov 28 13:31:40 crc kubenswrapper[4970]: I1128 13:31:40.358409 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/4729d0a5-be5a-4c85-83eb-f213d31f3755-webhook-cert\") pod \"mariadb-operator-controller-manager-7bdb74f78d-ct66p\" (UID: \"4729d0a5-be5a-4c85-83eb-f213d31f3755\") " pod="openstack-operators/mariadb-operator-controller-manager-7bdb74f78d-ct66p" Nov 28 13:31:40 crc kubenswrapper[4970]: I1128 13:31:40.358671 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/4729d0a5-be5a-4c85-83eb-f213d31f3755-apiservice-cert\") pod \"mariadb-operator-controller-manager-7bdb74f78d-ct66p\" (UID: \"4729d0a5-be5a-4c85-83eb-f213d31f3755\") " pod="openstack-operators/mariadb-operator-controller-manager-7bdb74f78d-ct66p" Nov 28 13:31:40 crc kubenswrapper[4970]: I1128 13:31:40.358729 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s4wbk\" (UniqueName: \"kubernetes.io/projected/4729d0a5-be5a-4c85-83eb-f213d31f3755-kube-api-access-s4wbk\") pod \"mariadb-operator-controller-manager-7bdb74f78d-ct66p\" (UID: \"4729d0a5-be5a-4c85-83eb-f213d31f3755\") " pod="openstack-operators/mariadb-operator-controller-manager-7bdb74f78d-ct66p" Nov 28 13:31:40 crc kubenswrapper[4970]: I1128 13:31:40.460298 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/4729d0a5-be5a-4c85-83eb-f213d31f3755-apiservice-cert\") pod \"mariadb-operator-controller-manager-7bdb74f78d-ct66p\" (UID: \"4729d0a5-be5a-4c85-83eb-f213d31f3755\") " pod="openstack-operators/mariadb-operator-controller-manager-7bdb74f78d-ct66p" Nov 28 13:31:40 crc kubenswrapper[4970]: I1128 13:31:40.460387 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s4wbk\" (UniqueName: \"kubernetes.io/projected/4729d0a5-be5a-4c85-83eb-f213d31f3755-kube-api-access-s4wbk\") pod \"mariadb-operator-controller-manager-7bdb74f78d-ct66p\" (UID: \"4729d0a5-be5a-4c85-83eb-f213d31f3755\") " pod="openstack-operators/mariadb-operator-controller-manager-7bdb74f78d-ct66p" Nov 28 13:31:40 crc kubenswrapper[4970]: I1128 13:31:40.460473 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/4729d0a5-be5a-4c85-83eb-f213d31f3755-webhook-cert\") pod \"mariadb-operator-controller-manager-7bdb74f78d-ct66p\" (UID: \"4729d0a5-be5a-4c85-83eb-f213d31f3755\") 
" pod="openstack-operators/mariadb-operator-controller-manager-7bdb74f78d-ct66p" Nov 28 13:31:40 crc kubenswrapper[4970]: I1128 13:31:40.469496 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/4729d0a5-be5a-4c85-83eb-f213d31f3755-webhook-cert\") pod \"mariadb-operator-controller-manager-7bdb74f78d-ct66p\" (UID: \"4729d0a5-be5a-4c85-83eb-f213d31f3755\") " pod="openstack-operators/mariadb-operator-controller-manager-7bdb74f78d-ct66p" Nov 28 13:31:40 crc kubenswrapper[4970]: I1128 13:31:40.470469 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/4729d0a5-be5a-4c85-83eb-f213d31f3755-apiservice-cert\") pod \"mariadb-operator-controller-manager-7bdb74f78d-ct66p\" (UID: \"4729d0a5-be5a-4c85-83eb-f213d31f3755\") " pod="openstack-operators/mariadb-operator-controller-manager-7bdb74f78d-ct66p" Nov 28 13:31:40 crc kubenswrapper[4970]: I1128 13:31:40.495247 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s4wbk\" (UniqueName: \"kubernetes.io/projected/4729d0a5-be5a-4c85-83eb-f213d31f3755-kube-api-access-s4wbk\") pod \"mariadb-operator-controller-manager-7bdb74f78d-ct66p\" (UID: \"4729d0a5-be5a-4c85-83eb-f213d31f3755\") " pod="openstack-operators/mariadb-operator-controller-manager-7bdb74f78d-ct66p" Nov 28 13:31:40 crc kubenswrapper[4970]: I1128 13:31:40.558256 4970 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-7bdb74f78d-ct66p" Nov 28 13:31:40 crc kubenswrapper[4970]: I1128 13:31:40.787909 4970 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-7bdb74f78d-ct66p"] Nov 28 13:31:41 crc kubenswrapper[4970]: I1128 13:31:41.013448 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-7bdb74f78d-ct66p" event={"ID":"4729d0a5-be5a-4c85-83eb-f213d31f3755","Type":"ContainerStarted","Data":"68c76262a55f624e75e6fbaf2ceae091a1c8831d91f622ecc75a1d733ac38d97"} Nov 28 13:31:46 crc kubenswrapper[4970]: I1128 13:31:46.046187 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-7bdb74f78d-ct66p" event={"ID":"4729d0a5-be5a-4c85-83eb-f213d31f3755","Type":"ContainerStarted","Data":"14802a15c862493a6fb83e83a11d6ce21c0646bffbd8ac711da3252e2b2ace6e"} Nov 28 13:31:46 crc kubenswrapper[4970]: I1128 13:31:46.046767 4970 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/mariadb-operator-controller-manager-7bdb74f78d-ct66p" Nov 28 13:31:46 crc kubenswrapper[4970]: I1128 13:31:46.062563 4970 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/mariadb-operator-controller-manager-7bdb74f78d-ct66p" podStartSLOduration=1.078715403 podStartE2EDuration="6.062543542s" podCreationTimestamp="2025-11-28 13:31:40 +0000 UTC" firstStartedPulling="2025-11-28 13:31:40.79968639 +0000 UTC m=+711.652568190" lastFinishedPulling="2025-11-28 13:31:45.783514519 +0000 UTC m=+716.636396329" observedRunningTime="2025-11-28 13:31:46.060976907 +0000 UTC m=+716.913858737" watchObservedRunningTime="2025-11-28 13:31:46.062543542 +0000 UTC m=+716.915425342" Nov 28 13:31:50 crc kubenswrapper[4970]: I1128 13:31:50.566210 4970 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack-operators/mariadb-operator-controller-manager-7bdb74f78d-ct66p" Nov 28 13:31:51 crc kubenswrapper[4970]: I1128 13:31:51.334519 4970 patch_prober.go:28] interesting pod/machine-config-daemon-tjrng container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 28 13:31:51 crc kubenswrapper[4970]: I1128 13:31:51.334606 4970 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-tjrng" podUID="70bedd43-c527-436e-b47b-0b9ec5b10601" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 28 13:31:56 crc kubenswrapper[4970]: I1128 13:31:56.316685 4970 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83hlqts"] Nov 28 13:31:56 crc kubenswrapper[4970]: I1128 13:31:56.318798 4970 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83hlqts" Nov 28 13:31:56 crc kubenswrapper[4970]: I1128 13:31:56.321416 4970 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Nov 28 13:31:56 crc kubenswrapper[4970]: I1128 13:31:56.333546 4970 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83hlqts"] Nov 28 13:31:56 crc kubenswrapper[4970]: I1128 13:31:56.485146 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kpbj6\" (UniqueName: \"kubernetes.io/projected/03a02bc0-9c7f-440d-8cf3-0f21ddb0ff2c-kube-api-access-kpbj6\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83hlqts\" (UID: \"03a02bc0-9c7f-440d-8cf3-0f21ddb0ff2c\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83hlqts" Nov 28 13:31:56 crc kubenswrapper[4970]: I1128 13:31:56.485200 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/03a02bc0-9c7f-440d-8cf3-0f21ddb0ff2c-bundle\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83hlqts\" (UID: \"03a02bc0-9c7f-440d-8cf3-0f21ddb0ff2c\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83hlqts" Nov 28 13:31:56 crc kubenswrapper[4970]: I1128 13:31:56.485272 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/03a02bc0-9c7f-440d-8cf3-0f21ddb0ff2c-util\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83hlqts\" (UID: \"03a02bc0-9c7f-440d-8cf3-0f21ddb0ff2c\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83hlqts" Nov 28 13:31:56 crc kubenswrapper[4970]: I1128 13:31:56.586649 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kpbj6\" (UniqueName: \"kubernetes.io/projected/03a02bc0-9c7f-440d-8cf3-0f21ddb0ff2c-kube-api-access-kpbj6\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83hlqts\" (UID: \"03a02bc0-9c7f-440d-8cf3-0f21ddb0ff2c\") " 
pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83hlqts" Nov 28 13:31:56 crc kubenswrapper[4970]: I1128 13:31:56.586712 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/03a02bc0-9c7f-440d-8cf3-0f21ddb0ff2c-bundle\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83hlqts\" (UID: \"03a02bc0-9c7f-440d-8cf3-0f21ddb0ff2c\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83hlqts" Nov 28 13:31:56 crc kubenswrapper[4970]: I1128 13:31:56.586751 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/03a02bc0-9c7f-440d-8cf3-0f21ddb0ff2c-util\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83hlqts\" (UID: \"03a02bc0-9c7f-440d-8cf3-0f21ddb0ff2c\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83hlqts" Nov 28 13:31:56 crc kubenswrapper[4970]: I1128 13:31:56.587844 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/03a02bc0-9c7f-440d-8cf3-0f21ddb0ff2c-util\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83hlqts\" (UID: \"03a02bc0-9c7f-440d-8cf3-0f21ddb0ff2c\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83hlqts" Nov 28 13:31:56 crc kubenswrapper[4970]: I1128 13:31:56.588429 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/03a02bc0-9c7f-440d-8cf3-0f21ddb0ff2c-bundle\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83hlqts\" (UID: \"03a02bc0-9c7f-440d-8cf3-0f21ddb0ff2c\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83hlqts" Nov 28 13:31:56 crc kubenswrapper[4970]: I1128 13:31:56.615144 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kpbj6\" (UniqueName: \"kubernetes.io/projected/03a02bc0-9c7f-440d-8cf3-0f21ddb0ff2c-kube-api-access-kpbj6\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83hlqts\" (UID: \"03a02bc0-9c7f-440d-8cf3-0f21ddb0ff2c\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83hlqts" Nov 28 13:31:56 crc kubenswrapper[4970]: I1128 13:31:56.641413 4970 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83hlqts" Nov 28 13:31:56 crc kubenswrapper[4970]: I1128 13:31:56.890184 4970 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83hlqts"] Nov 28 13:31:56 crc kubenswrapper[4970]: W1128 13:31:56.899461 4970 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod03a02bc0_9c7f_440d_8cf3_0f21ddb0ff2c.slice/crio-e3f363fd756c8bea6f5778913096b6feaad7e8d12248e28e2027167ac3efeb57 WatchSource:0}: Error finding container e3f363fd756c8bea6f5778913096b6feaad7e8d12248e28e2027167ac3efeb57: Status 404 returned error can't find the container with id e3f363fd756c8bea6f5778913096b6feaad7e8d12248e28e2027167ac3efeb57 Nov 28 13:31:57 crc kubenswrapper[4970]: I1128 13:31:57.111434 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83hlqts" event={"ID":"03a02bc0-9c7f-440d-8cf3-0f21ddb0ff2c","Type":"ContainerStarted","Data":"e3f363fd756c8bea6f5778913096b6feaad7e8d12248e28e2027167ac3efeb57"} Nov 28 13:32:00 crc kubenswrapper[4970]: I1128 13:32:00.141487 4970 generic.go:334] "Generic (PLEG): container finished" podID="03a02bc0-9c7f-440d-8cf3-0f21ddb0ff2c" containerID="60d7b5d5f57d90d371df21b6e6c2bd33487d2ef2d0a81b1ffd33a7eb4d7e3d95" exitCode=0 Nov 28 13:32:00 crc kubenswrapper[4970]: I1128 13:32:00.141857 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83hlqts" event={"ID":"03a02bc0-9c7f-440d-8cf3-0f21ddb0ff2c","Type":"ContainerDied","Data":"60d7b5d5f57d90d371df21b6e6c2bd33487d2ef2d0a81b1ffd33a7eb4d7e3d95"} Nov 28 13:32:00 crc kubenswrapper[4970]: I1128 13:32:00.478143 4970 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-jwr9r"] Nov 28 13:32:00 crc kubenswrapper[4970]: I1128 13:32:00.480347 4970 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-jwr9r" Nov 28 13:32:00 crc kubenswrapper[4970]: I1128 13:32:00.504043 4970 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-jwr9r"] Nov 28 13:32:00 crc kubenswrapper[4970]: I1128 13:32:00.641880 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sl2nd\" (UniqueName: \"kubernetes.io/projected/f9bb171b-73a8-46b5-bc21-ecf897ae6312-kube-api-access-sl2nd\") pod \"community-operators-jwr9r\" (UID: \"f9bb171b-73a8-46b5-bc21-ecf897ae6312\") " pod="openshift-marketplace/community-operators-jwr9r" Nov 28 13:32:00 crc kubenswrapper[4970]: I1128 13:32:00.642060 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f9bb171b-73a8-46b5-bc21-ecf897ae6312-catalog-content\") pod \"community-operators-jwr9r\" (UID: \"f9bb171b-73a8-46b5-bc21-ecf897ae6312\") " pod="openshift-marketplace/community-operators-jwr9r" Nov 28 13:32:00 crc kubenswrapper[4970]: I1128 13:32:00.642333 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f9bb171b-73a8-46b5-bc21-ecf897ae6312-utilities\") pod \"community-operators-jwr9r\" (UID: \"f9bb171b-73a8-46b5-bc21-ecf897ae6312\") " pod="openshift-marketplace/community-operators-jwr9r" Nov 28 13:32:00 crc kubenswrapper[4970]: I1128 13:32:00.744197 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f9bb171b-73a8-46b5-bc21-ecf897ae6312-catalog-content\") pod \"community-operators-jwr9r\" (UID: \"f9bb171b-73a8-46b5-bc21-ecf897ae6312\") " pod="openshift-marketplace/community-operators-jwr9r" Nov 28 13:32:00 crc kubenswrapper[4970]: I1128 13:32:00.744308 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f9bb171b-73a8-46b5-bc21-ecf897ae6312-utilities\") pod \"community-operators-jwr9r\" (UID: \"f9bb171b-73a8-46b5-bc21-ecf897ae6312\") " pod="openshift-marketplace/community-operators-jwr9r" Nov 28 13:32:00 crc kubenswrapper[4970]: I1128 13:32:00.744368 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sl2nd\" (UniqueName: \"kubernetes.io/projected/f9bb171b-73a8-46b5-bc21-ecf897ae6312-kube-api-access-sl2nd\") pod \"community-operators-jwr9r\" (UID: \"f9bb171b-73a8-46b5-bc21-ecf897ae6312\") " pod="openshift-marketplace/community-operators-jwr9r" Nov 28 13:32:00 crc kubenswrapper[4970]: I1128 13:32:00.744920 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f9bb171b-73a8-46b5-bc21-ecf897ae6312-utilities\") pod \"community-operators-jwr9r\" (UID: \"f9bb171b-73a8-46b5-bc21-ecf897ae6312\") " pod="openshift-marketplace/community-operators-jwr9r" Nov 28 13:32:00 crc kubenswrapper[4970]: I1128 13:32:00.744925 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f9bb171b-73a8-46b5-bc21-ecf897ae6312-catalog-content\") pod \"community-operators-jwr9r\" (UID: \"f9bb171b-73a8-46b5-bc21-ecf897ae6312\") " pod="openshift-marketplace/community-operators-jwr9r" Nov 28 13:32:00 crc kubenswrapper[4970]: I1128 13:32:00.775452 4970 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-sl2nd\" (UniqueName: \"kubernetes.io/projected/f9bb171b-73a8-46b5-bc21-ecf897ae6312-kube-api-access-sl2nd\") pod \"community-operators-jwr9r\" (UID: \"f9bb171b-73a8-46b5-bc21-ecf897ae6312\") " pod="openshift-marketplace/community-operators-jwr9r" Nov 28 13:32:00 crc kubenswrapper[4970]: I1128 13:32:00.810123 4970 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-jwr9r" Nov 28 13:32:01 crc kubenswrapper[4970]: I1128 13:32:01.066266 4970 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-689wj"] Nov 28 13:32:01 crc kubenswrapper[4970]: I1128 13:32:01.070822 4970 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-689wj" Nov 28 13:32:01 crc kubenswrapper[4970]: I1128 13:32:01.084054 4970 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-689wj"] Nov 28 13:32:01 crc kubenswrapper[4970]: I1128 13:32:01.250184 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-28jfp\" (UniqueName: \"kubernetes.io/projected/8662a1e5-aa56-4150-bc05-815f90bd16cd-kube-api-access-28jfp\") pod \"redhat-operators-689wj\" (UID: \"8662a1e5-aa56-4150-bc05-815f90bd16cd\") " pod="openshift-marketplace/redhat-operators-689wj" Nov 28 13:32:01 crc kubenswrapper[4970]: I1128 13:32:01.250359 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8662a1e5-aa56-4150-bc05-815f90bd16cd-utilities\") pod \"redhat-operators-689wj\" (UID: \"8662a1e5-aa56-4150-bc05-815f90bd16cd\") " pod="openshift-marketplace/redhat-operators-689wj" Nov 28 13:32:01 crc kubenswrapper[4970]: I1128 13:32:01.250421 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8662a1e5-aa56-4150-bc05-815f90bd16cd-catalog-content\") pod \"redhat-operators-689wj\" (UID: \"8662a1e5-aa56-4150-bc05-815f90bd16cd\") " pod="openshift-marketplace/redhat-operators-689wj" Nov 28 13:32:01 crc kubenswrapper[4970]: I1128 13:32:01.306318 4970 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-jwr9r"] Nov 28 13:32:01 crc kubenswrapper[4970]: W1128 13:32:01.311731 4970 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf9bb171b_73a8_46b5_bc21_ecf897ae6312.slice/crio-f1c325697b5cd8749cfedb4c9c72d7ec9c94c1017e4e14446c3501aa6cb633d7 WatchSource:0}: Error finding container f1c325697b5cd8749cfedb4c9c72d7ec9c94c1017e4e14446c3501aa6cb633d7: Status 404 returned error can't find the container with id f1c325697b5cd8749cfedb4c9c72d7ec9c94c1017e4e14446c3501aa6cb633d7 Nov 28 13:32:01 crc kubenswrapper[4970]: I1128 13:32:01.351497 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-28jfp\" (UniqueName: \"kubernetes.io/projected/8662a1e5-aa56-4150-bc05-815f90bd16cd-kube-api-access-28jfp\") pod \"redhat-operators-689wj\" (UID: \"8662a1e5-aa56-4150-bc05-815f90bd16cd\") " pod="openshift-marketplace/redhat-operators-689wj" Nov 28 13:32:01 crc kubenswrapper[4970]: I1128 13:32:01.351562 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/8662a1e5-aa56-4150-bc05-815f90bd16cd-utilities\") pod \"redhat-operators-689wj\" (UID: \"8662a1e5-aa56-4150-bc05-815f90bd16cd\") " pod="openshift-marketplace/redhat-operators-689wj" Nov 28 13:32:01 crc kubenswrapper[4970]: I1128 13:32:01.351596 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8662a1e5-aa56-4150-bc05-815f90bd16cd-catalog-content\") pod \"redhat-operators-689wj\" (UID: \"8662a1e5-aa56-4150-bc05-815f90bd16cd\") " pod="openshift-marketplace/redhat-operators-689wj" Nov 28 13:32:01 crc kubenswrapper[4970]: I1128 13:32:01.352005 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8662a1e5-aa56-4150-bc05-815f90bd16cd-catalog-content\") pod \"redhat-operators-689wj\" (UID: \"8662a1e5-aa56-4150-bc05-815f90bd16cd\") " pod="openshift-marketplace/redhat-operators-689wj" Nov 28 13:32:01 crc kubenswrapper[4970]: I1128 13:32:01.352104 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8662a1e5-aa56-4150-bc05-815f90bd16cd-utilities\") pod \"redhat-operators-689wj\" (UID: \"8662a1e5-aa56-4150-bc05-815f90bd16cd\") " pod="openshift-marketplace/redhat-operators-689wj" Nov 28 13:32:01 crc kubenswrapper[4970]: I1128 13:32:01.370351 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-28jfp\" (UniqueName: \"kubernetes.io/projected/8662a1e5-aa56-4150-bc05-815f90bd16cd-kube-api-access-28jfp\") pod \"redhat-operators-689wj\" (UID: \"8662a1e5-aa56-4150-bc05-815f90bd16cd\") " pod="openshift-marketplace/redhat-operators-689wj" Nov 28 13:32:01 crc kubenswrapper[4970]: I1128 13:32:01.402107 4970 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-689wj" Nov 28 13:32:01 crc kubenswrapper[4970]: I1128 13:32:01.821422 4970 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-689wj"] Nov 28 13:32:02 crc kubenswrapper[4970]: W1128 13:32:02.122746 4970 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8662a1e5_aa56_4150_bc05_815f90bd16cd.slice/crio-340a5382bf45b3fcad9e50bb85fcd685bc0450d233618a9ae2f4d3ef65a6cced WatchSource:0}: Error finding container 340a5382bf45b3fcad9e50bb85fcd685bc0450d233618a9ae2f4d3ef65a6cced: Status 404 returned error can't find the container with id 340a5382bf45b3fcad9e50bb85fcd685bc0450d233618a9ae2f4d3ef65a6cced Nov 28 13:32:02 crc kubenswrapper[4970]: I1128 13:32:02.168400 4970 generic.go:334] "Generic (PLEG): container finished" podID="f9bb171b-73a8-46b5-bc21-ecf897ae6312" containerID="fafcaf396d4893977df7b8e7b7b5d560aa4983ccd422197ff38545e9bdf3813d" exitCode=0 Nov 28 13:32:02 crc kubenswrapper[4970]: I1128 13:32:02.168486 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jwr9r" event={"ID":"f9bb171b-73a8-46b5-bc21-ecf897ae6312","Type":"ContainerDied","Data":"fafcaf396d4893977df7b8e7b7b5d560aa4983ccd422197ff38545e9bdf3813d"} Nov 28 13:32:02 crc kubenswrapper[4970]: I1128 13:32:02.168523 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jwr9r" event={"ID":"f9bb171b-73a8-46b5-bc21-ecf897ae6312","Type":"ContainerStarted","Data":"f1c325697b5cd8749cfedb4c9c72d7ec9c94c1017e4e14446c3501aa6cb633d7"} Nov 28 13:32:02 crc kubenswrapper[4970]: I1128 13:32:02.174199 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-689wj" event={"ID":"8662a1e5-aa56-4150-bc05-815f90bd16cd","Type":"ContainerStarted","Data":"340a5382bf45b3fcad9e50bb85fcd685bc0450d233618a9ae2f4d3ef65a6cced"} Nov 28 13:32:03 crc kubenswrapper[4970]: I1128 13:32:03.186275 4970 generic.go:334] "Generic (PLEG): container finished" podID="8662a1e5-aa56-4150-bc05-815f90bd16cd" containerID="0ba2f45b01a634cc40e7e9ce3535b5bcd447fe9de13a002d1f0023c5ace25b31" exitCode=0 Nov 28 13:32:03 crc kubenswrapper[4970]: I1128 13:32:03.186360 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-689wj" event={"ID":"8662a1e5-aa56-4150-bc05-815f90bd16cd","Type":"ContainerDied","Data":"0ba2f45b01a634cc40e7e9ce3535b5bcd447fe9de13a002d1f0023c5ace25b31"} Nov 28 13:32:06 crc kubenswrapper[4970]: I1128 13:32:06.209088 4970 generic.go:334] "Generic (PLEG): container finished" podID="03a02bc0-9c7f-440d-8cf3-0f21ddb0ff2c" containerID="14e9138c669a93abd1a84869f7554a44b4076bcf41bc0a512c32a3322759cbb0" exitCode=0 Nov 28 13:32:06 crc kubenswrapper[4970]: I1128 13:32:06.209283 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83hlqts" event={"ID":"03a02bc0-9c7f-440d-8cf3-0f21ddb0ff2c","Type":"ContainerDied","Data":"14e9138c669a93abd1a84869f7554a44b4076bcf41bc0a512c32a3322759cbb0"} Nov 28 13:32:06 crc kubenswrapper[4970]: I1128 13:32:06.213168 4970 generic.go:334] "Generic (PLEG): container finished" podID="f9bb171b-73a8-46b5-bc21-ecf897ae6312" containerID="9fee5d7d52ce7e19b59ff637d23594dc78c7bd72d07ed0e2b0726f660fe20b2f" exitCode=0 Nov 28 13:32:06 crc kubenswrapper[4970]: I1128 13:32:06.213239 4970 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-marketplace/community-operators-jwr9r" event={"ID":"f9bb171b-73a8-46b5-bc21-ecf897ae6312","Type":"ContainerDied","Data":"9fee5d7d52ce7e19b59ff637d23594dc78c7bd72d07ed0e2b0726f660fe20b2f"} Nov 28 13:32:06 crc kubenswrapper[4970]: I1128 13:32:06.216593 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-689wj" event={"ID":"8662a1e5-aa56-4150-bc05-815f90bd16cd","Type":"ContainerStarted","Data":"aaf6bc44cdcec6d739563488d1e130651e1ac2d983d5e350381122cb409815a5"} Nov 28 13:32:07 crc kubenswrapper[4970]: I1128 13:32:07.226751 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jwr9r" event={"ID":"f9bb171b-73a8-46b5-bc21-ecf897ae6312","Type":"ContainerStarted","Data":"6e14144ee63ede436e8a824d0674806424ed6827f8389bd8981b4aefefd1f1ad"} Nov 28 13:32:07 crc kubenswrapper[4970]: I1128 13:32:07.229252 4970 generic.go:334] "Generic (PLEG): container finished" podID="8662a1e5-aa56-4150-bc05-815f90bd16cd" containerID="aaf6bc44cdcec6d739563488d1e130651e1ac2d983d5e350381122cb409815a5" exitCode=0 Nov 28 13:32:07 crc kubenswrapper[4970]: I1128 13:32:07.229344 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-689wj" event={"ID":"8662a1e5-aa56-4150-bc05-815f90bd16cd","Type":"ContainerDied","Data":"aaf6bc44cdcec6d739563488d1e130651e1ac2d983d5e350381122cb409815a5"} Nov 28 13:32:07 crc kubenswrapper[4970]: I1128 13:32:07.234725 4970 generic.go:334] "Generic (PLEG): container finished" podID="03a02bc0-9c7f-440d-8cf3-0f21ddb0ff2c" containerID="d1550bc9a8317697e0c04585ecd4d84c144b8084101c26bca8414dc3dfda13e6" exitCode=0 Nov 28 13:32:07 crc kubenswrapper[4970]: I1128 13:32:07.234764 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83hlqts" event={"ID":"03a02bc0-9c7f-440d-8cf3-0f21ddb0ff2c","Type":"ContainerDied","Data":"d1550bc9a8317697e0c04585ecd4d84c144b8084101c26bca8414dc3dfda13e6"} Nov 28 13:32:07 crc kubenswrapper[4970]: I1128 13:32:07.257412 4970 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-jwr9r" podStartSLOduration=2.684746745 podStartE2EDuration="7.257383246s" podCreationTimestamp="2025-11-28 13:32:00 +0000 UTC" firstStartedPulling="2025-11-28 13:32:02.172903034 +0000 UTC m=+733.025784854" lastFinishedPulling="2025-11-28 13:32:06.745539545 +0000 UTC m=+737.598421355" observedRunningTime="2025-11-28 13:32:07.253880005 +0000 UTC m=+738.106761835" watchObservedRunningTime="2025-11-28 13:32:07.257383246 +0000 UTC m=+738.110265086" Nov 28 13:32:08 crc kubenswrapper[4970]: I1128 13:32:08.241631 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-689wj" event={"ID":"8662a1e5-aa56-4150-bc05-815f90bd16cd","Type":"ContainerStarted","Data":"d800e940f815b147b534d14c787ece37979cfbf8e8e38ad9dae376807f2b2f13"} Nov 28 13:32:08 crc kubenswrapper[4970]: I1128 13:32:08.261088 4970 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-689wj" podStartSLOduration=2.716305546 podStartE2EDuration="7.261069726s" podCreationTimestamp="2025-11-28 13:32:01 +0000 UTC" firstStartedPulling="2025-11-28 13:32:03.189629939 +0000 UTC m=+734.042511749" lastFinishedPulling="2025-11-28 13:32:07.734394089 +0000 UTC m=+738.587275929" observedRunningTime="2025-11-28 13:32:08.26086955 +0000 UTC m=+739.113751350" 
watchObservedRunningTime="2025-11-28 13:32:08.261069726 +0000 UTC m=+739.113951526" Nov 28 13:32:08 crc kubenswrapper[4970]: I1128 13:32:08.474825 4970 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83hlqts" Nov 28 13:32:08 crc kubenswrapper[4970]: I1128 13:32:08.656816 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/03a02bc0-9c7f-440d-8cf3-0f21ddb0ff2c-util\") pod \"03a02bc0-9c7f-440d-8cf3-0f21ddb0ff2c\" (UID: \"03a02bc0-9c7f-440d-8cf3-0f21ddb0ff2c\") " Nov 28 13:32:08 crc kubenswrapper[4970]: I1128 13:32:08.656877 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kpbj6\" (UniqueName: \"kubernetes.io/projected/03a02bc0-9c7f-440d-8cf3-0f21ddb0ff2c-kube-api-access-kpbj6\") pod \"03a02bc0-9c7f-440d-8cf3-0f21ddb0ff2c\" (UID: \"03a02bc0-9c7f-440d-8cf3-0f21ddb0ff2c\") " Nov 28 13:32:08 crc kubenswrapper[4970]: I1128 13:32:08.656917 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/03a02bc0-9c7f-440d-8cf3-0f21ddb0ff2c-bundle\") pod \"03a02bc0-9c7f-440d-8cf3-0f21ddb0ff2c\" (UID: \"03a02bc0-9c7f-440d-8cf3-0f21ddb0ff2c\") " Nov 28 13:32:08 crc kubenswrapper[4970]: I1128 13:32:08.657887 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/03a02bc0-9c7f-440d-8cf3-0f21ddb0ff2c-bundle" (OuterVolumeSpecName: "bundle") pod "03a02bc0-9c7f-440d-8cf3-0f21ddb0ff2c" (UID: "03a02bc0-9c7f-440d-8cf3-0f21ddb0ff2c"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 28 13:32:08 crc kubenswrapper[4970]: I1128 13:32:08.666324 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/03a02bc0-9c7f-440d-8cf3-0f21ddb0ff2c-util" (OuterVolumeSpecName: "util") pod "03a02bc0-9c7f-440d-8cf3-0f21ddb0ff2c" (UID: "03a02bc0-9c7f-440d-8cf3-0f21ddb0ff2c"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 28 13:32:08 crc kubenswrapper[4970]: I1128 13:32:08.668417 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/03a02bc0-9c7f-440d-8cf3-0f21ddb0ff2c-kube-api-access-kpbj6" (OuterVolumeSpecName: "kube-api-access-kpbj6") pod "03a02bc0-9c7f-440d-8cf3-0f21ddb0ff2c" (UID: "03a02bc0-9c7f-440d-8cf3-0f21ddb0ff2c"). InnerVolumeSpecName "kube-api-access-kpbj6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 28 13:32:08 crc kubenswrapper[4970]: I1128 13:32:08.758517 4970 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/03a02bc0-9c7f-440d-8cf3-0f21ddb0ff2c-util\") on node \"crc\" DevicePath \"\"" Nov 28 13:32:08 crc kubenswrapper[4970]: I1128 13:32:08.758756 4970 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kpbj6\" (UniqueName: \"kubernetes.io/projected/03a02bc0-9c7f-440d-8cf3-0f21ddb0ff2c-kube-api-access-kpbj6\") on node \"crc\" DevicePath \"\"" Nov 28 13:32:08 crc kubenswrapper[4970]: I1128 13:32:08.758768 4970 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/03a02bc0-9c7f-440d-8cf3-0f21ddb0ff2c-bundle\") on node \"crc\" DevicePath \"\"" Nov 28 13:32:09 crc kubenswrapper[4970]: I1128 13:32:09.249169 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83hlqts" event={"ID":"03a02bc0-9c7f-440d-8cf3-0f21ddb0ff2c","Type":"ContainerDied","Data":"e3f363fd756c8bea6f5778913096b6feaad7e8d12248e28e2027167ac3efeb57"} Nov 28 13:32:09 crc kubenswrapper[4970]: I1128 13:32:09.249228 4970 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e3f363fd756c8bea6f5778913096b6feaad7e8d12248e28e2027167ac3efeb57" Nov 28 13:32:09 crc kubenswrapper[4970]: I1128 13:32:09.249250 4970 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83hlqts" Nov 28 13:32:10 crc kubenswrapper[4970]: I1128 13:32:10.810996 4970 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-jwr9r" Nov 28 13:32:10 crc kubenswrapper[4970]: I1128 13:32:10.811083 4970 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-jwr9r" Nov 28 13:32:10 crc kubenswrapper[4970]: I1128 13:32:10.874564 4970 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-jwr9r" Nov 28 13:32:11 crc kubenswrapper[4970]: I1128 13:32:11.302200 4970 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-jwr9r" Nov 28 13:32:11 crc kubenswrapper[4970]: I1128 13:32:11.403041 4970 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-689wj" Nov 28 13:32:11 crc kubenswrapper[4970]: I1128 13:32:11.403112 4970 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-689wj" Nov 28 13:32:12 crc kubenswrapper[4970]: I1128 13:32:12.444429 4970 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-689wj" podUID="8662a1e5-aa56-4150-bc05-815f90bd16cd" containerName="registry-server" probeResult="failure" output=< Nov 28 13:32:12 crc kubenswrapper[4970]: timeout: failed to connect service ":50051" within 1s Nov 28 13:32:12 crc kubenswrapper[4970]: > Nov 28 13:32:13 crc kubenswrapper[4970]: I1128 13:32:13.258675 4970 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-jwr9r"] Nov 28 13:32:13 crc kubenswrapper[4970]: I1128 13:32:13.273954 4970 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-marketplace/community-operators-jwr9r" podUID="f9bb171b-73a8-46b5-bc21-ecf897ae6312" containerName="registry-server" containerID="cri-o://6e14144ee63ede436e8a824d0674806424ed6827f8389bd8981b4aefefd1f1ad" gracePeriod=2 Nov 28 13:32:17 crc kubenswrapper[4970]: I1128 13:32:17.362180 4970 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-controller-manager-7b8bd764cc-xwfzf"] Nov 28 13:32:17 crc kubenswrapper[4970]: E1128 13:32:17.363292 4970 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="03a02bc0-9c7f-440d-8cf3-0f21ddb0ff2c" containerName="pull" Nov 28 13:32:17 crc kubenswrapper[4970]: I1128 13:32:17.363321 4970 state_mem.go:107] "Deleted CPUSet assignment" podUID="03a02bc0-9c7f-440d-8cf3-0f21ddb0ff2c" containerName="pull" Nov 28 13:32:17 crc kubenswrapper[4970]: E1128 13:32:17.363351 4970 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="03a02bc0-9c7f-440d-8cf3-0f21ddb0ff2c" containerName="extract" Nov 28 13:32:17 crc kubenswrapper[4970]: I1128 13:32:17.363367 4970 state_mem.go:107] "Deleted CPUSet assignment" podUID="03a02bc0-9c7f-440d-8cf3-0f21ddb0ff2c" containerName="extract" Nov 28 13:32:17 crc kubenswrapper[4970]: E1128 13:32:17.363414 4970 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="03a02bc0-9c7f-440d-8cf3-0f21ddb0ff2c" containerName="util" Nov 28 13:32:17 crc kubenswrapper[4970]: I1128 13:32:17.363430 4970 state_mem.go:107] "Deleted CPUSet assignment" podUID="03a02bc0-9c7f-440d-8cf3-0f21ddb0ff2c" containerName="util" Nov 28 13:32:17 crc kubenswrapper[4970]: I1128 13:32:17.363683 4970 memory_manager.go:354] "RemoveStaleState removing state" podUID="03a02bc0-9c7f-440d-8cf3-0f21ddb0ff2c" containerName="extract" Nov 28 13:32:17 crc kubenswrapper[4970]: I1128 13:32:17.364381 4970 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-7b8bd764cc-xwfzf" Nov 28 13:32:17 crc kubenswrapper[4970]: I1128 13:32:17.366582 4970 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-cert" Nov 28 13:32:17 crc kubenswrapper[4970]: I1128 13:32:17.367093 4970 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"openshift-service-ca.crt" Nov 28 13:32:17 crc kubenswrapper[4970]: I1128 13:32:17.368425 4970 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-controller-manager-service-cert" Nov 28 13:32:17 crc kubenswrapper[4970]: I1128 13:32:17.368684 4970 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"kube-root-ca.crt" Nov 28 13:32:17 crc kubenswrapper[4970]: I1128 13:32:17.374901 4970 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"manager-account-dockercfg-7zfcz" Nov 28 13:32:17 crc kubenswrapper[4970]: I1128 13:32:17.374932 4970 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-7b8bd764cc-xwfzf"] Nov 28 13:32:17 crc kubenswrapper[4970]: I1128 13:32:17.477779 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/04e4d11f-bb00-41b3-9047-0669f0e051c2-webhook-cert\") pod \"metallb-operator-controller-manager-7b8bd764cc-xwfzf\" (UID: \"04e4d11f-bb00-41b3-9047-0669f0e051c2\") " pod="metallb-system/metallb-operator-controller-manager-7b8bd764cc-xwfzf" Nov 28 13:32:17 crc kubenswrapper[4970]: I1128 13:32:17.477847 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/04e4d11f-bb00-41b3-9047-0669f0e051c2-apiservice-cert\") pod \"metallb-operator-controller-manager-7b8bd764cc-xwfzf\" (UID: \"04e4d11f-bb00-41b3-9047-0669f0e051c2\") " pod="metallb-system/metallb-operator-controller-manager-7b8bd764cc-xwfzf" Nov 28 13:32:17 crc kubenswrapper[4970]: I1128 13:32:17.477888 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5dtdh\" (UniqueName: \"kubernetes.io/projected/04e4d11f-bb00-41b3-9047-0669f0e051c2-kube-api-access-5dtdh\") pod \"metallb-operator-controller-manager-7b8bd764cc-xwfzf\" (UID: \"04e4d11f-bb00-41b3-9047-0669f0e051c2\") " pod="metallb-system/metallb-operator-controller-manager-7b8bd764cc-xwfzf" Nov 28 13:32:17 crc kubenswrapper[4970]: I1128 13:32:17.579302 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/04e4d11f-bb00-41b3-9047-0669f0e051c2-webhook-cert\") pod \"metallb-operator-controller-manager-7b8bd764cc-xwfzf\" (UID: \"04e4d11f-bb00-41b3-9047-0669f0e051c2\") " pod="metallb-system/metallb-operator-controller-manager-7b8bd764cc-xwfzf" Nov 28 13:32:17 crc kubenswrapper[4970]: I1128 13:32:17.579388 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/04e4d11f-bb00-41b3-9047-0669f0e051c2-apiservice-cert\") pod \"metallb-operator-controller-manager-7b8bd764cc-xwfzf\" (UID: \"04e4d11f-bb00-41b3-9047-0669f0e051c2\") " pod="metallb-system/metallb-operator-controller-manager-7b8bd764cc-xwfzf" Nov 28 13:32:17 crc kubenswrapper[4970]: I1128 13:32:17.579436 4970 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5dtdh\" (UniqueName: \"kubernetes.io/projected/04e4d11f-bb00-41b3-9047-0669f0e051c2-kube-api-access-5dtdh\") pod \"metallb-operator-controller-manager-7b8bd764cc-xwfzf\" (UID: \"04e4d11f-bb00-41b3-9047-0669f0e051c2\") " pod="metallb-system/metallb-operator-controller-manager-7b8bd764cc-xwfzf" Nov 28 13:32:17 crc kubenswrapper[4970]: I1128 13:32:17.587984 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/04e4d11f-bb00-41b3-9047-0669f0e051c2-webhook-cert\") pod \"metallb-operator-controller-manager-7b8bd764cc-xwfzf\" (UID: \"04e4d11f-bb00-41b3-9047-0669f0e051c2\") " pod="metallb-system/metallb-operator-controller-manager-7b8bd764cc-xwfzf" Nov 28 13:32:17 crc kubenswrapper[4970]: I1128 13:32:17.593161 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/04e4d11f-bb00-41b3-9047-0669f0e051c2-apiservice-cert\") pod \"metallb-operator-controller-manager-7b8bd764cc-xwfzf\" (UID: \"04e4d11f-bb00-41b3-9047-0669f0e051c2\") " pod="metallb-system/metallb-operator-controller-manager-7b8bd764cc-xwfzf" Nov 28 13:32:17 crc kubenswrapper[4970]: I1128 13:32:17.595707 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5dtdh\" (UniqueName: \"kubernetes.io/projected/04e4d11f-bb00-41b3-9047-0669f0e051c2-kube-api-access-5dtdh\") pod \"metallb-operator-controller-manager-7b8bd764cc-xwfzf\" (UID: \"04e4d11f-bb00-41b3-9047-0669f0e051c2\") " pod="metallb-system/metallb-operator-controller-manager-7b8bd764cc-xwfzf" Nov 28 13:32:17 crc kubenswrapper[4970]: I1128 13:32:17.698199 4970 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-7b8bd764cc-xwfzf" Nov 28 13:32:17 crc kubenswrapper[4970]: I1128 13:32:17.949697 4970 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-7b8bd764cc-xwfzf"] Nov 28 13:32:17 crc kubenswrapper[4970]: W1128 13:32:17.952615 4970 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod04e4d11f_bb00_41b3_9047_0669f0e051c2.slice/crio-50f8d6f5ae2794114a4676e0bbcde91db5a1251ba3e3eae7eb7b395eb6ed148f WatchSource:0}: Error finding container 50f8d6f5ae2794114a4676e0bbcde91db5a1251ba3e3eae7eb7b395eb6ed148f: Status 404 returned error can't find the container with id 50f8d6f5ae2794114a4676e0bbcde91db5a1251ba3e3eae7eb7b395eb6ed148f Nov 28 13:32:18 crc kubenswrapper[4970]: I1128 13:32:18.441556 4970 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-webhook-server-778f645448-g7nc5"] Nov 28 13:32:18 crc kubenswrapper[4970]: I1128 13:32:18.442515 4970 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-778f645448-g7nc5" Nov 28 13:32:18 crc kubenswrapper[4970]: I1128 13:32:18.444972 4970 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-dockercfg-gshkg" Nov 28 13:32:18 crc kubenswrapper[4970]: I1128 13:32:18.445123 4970 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-service-cert" Nov 28 13:32:18 crc kubenswrapper[4970]: I1128 13:32:18.445879 4970 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Nov 28 13:32:18 crc kubenswrapper[4970]: I1128 13:32:18.456908 4970 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-778f645448-g7nc5"] Nov 28 13:32:18 crc kubenswrapper[4970]: I1128 13:32:18.494077 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xgxhp\" (UniqueName: \"kubernetes.io/projected/3c970e39-c0b1-4690-8e0a-f925a49d72a9-kube-api-access-xgxhp\") pod \"metallb-operator-webhook-server-778f645448-g7nc5\" (UID: \"3c970e39-c0b1-4690-8e0a-f925a49d72a9\") " pod="metallb-system/metallb-operator-webhook-server-778f645448-g7nc5" Nov 28 13:32:18 crc kubenswrapper[4970]: I1128 13:32:18.494175 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/3c970e39-c0b1-4690-8e0a-f925a49d72a9-webhook-cert\") pod \"metallb-operator-webhook-server-778f645448-g7nc5\" (UID: \"3c970e39-c0b1-4690-8e0a-f925a49d72a9\") " pod="metallb-system/metallb-operator-webhook-server-778f645448-g7nc5" Nov 28 13:32:18 crc kubenswrapper[4970]: I1128 13:32:18.494202 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/3c970e39-c0b1-4690-8e0a-f925a49d72a9-apiservice-cert\") pod \"metallb-operator-webhook-server-778f645448-g7nc5\" (UID: \"3c970e39-c0b1-4690-8e0a-f925a49d72a9\") " pod="metallb-system/metallb-operator-webhook-server-778f645448-g7nc5" Nov 28 13:32:18 crc kubenswrapper[4970]: I1128 13:32:18.595755 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xgxhp\" (UniqueName: \"kubernetes.io/projected/3c970e39-c0b1-4690-8e0a-f925a49d72a9-kube-api-access-xgxhp\") pod \"metallb-operator-webhook-server-778f645448-g7nc5\" (UID: \"3c970e39-c0b1-4690-8e0a-f925a49d72a9\") " pod="metallb-system/metallb-operator-webhook-server-778f645448-g7nc5" Nov 28 13:32:18 crc kubenswrapper[4970]: I1128 13:32:18.595910 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/3c970e39-c0b1-4690-8e0a-f925a49d72a9-webhook-cert\") pod \"metallb-operator-webhook-server-778f645448-g7nc5\" (UID: \"3c970e39-c0b1-4690-8e0a-f925a49d72a9\") " pod="metallb-system/metallb-operator-webhook-server-778f645448-g7nc5" Nov 28 13:32:18 crc kubenswrapper[4970]: I1128 13:32:18.595967 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/3c970e39-c0b1-4690-8e0a-f925a49d72a9-apiservice-cert\") pod \"metallb-operator-webhook-server-778f645448-g7nc5\" (UID: \"3c970e39-c0b1-4690-8e0a-f925a49d72a9\") " pod="metallb-system/metallb-operator-webhook-server-778f645448-g7nc5" Nov 28 13:32:18 crc kubenswrapper[4970]: I1128 
13:32:18.603920 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/3c970e39-c0b1-4690-8e0a-f925a49d72a9-webhook-cert\") pod \"metallb-operator-webhook-server-778f645448-g7nc5\" (UID: \"3c970e39-c0b1-4690-8e0a-f925a49d72a9\") " pod="metallb-system/metallb-operator-webhook-server-778f645448-g7nc5" Nov 28 13:32:18 crc kubenswrapper[4970]: I1128 13:32:18.615194 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/3c970e39-c0b1-4690-8e0a-f925a49d72a9-apiservice-cert\") pod \"metallb-operator-webhook-server-778f645448-g7nc5\" (UID: \"3c970e39-c0b1-4690-8e0a-f925a49d72a9\") " pod="metallb-system/metallb-operator-webhook-server-778f645448-g7nc5" Nov 28 13:32:18 crc kubenswrapper[4970]: I1128 13:32:18.631869 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xgxhp\" (UniqueName: \"kubernetes.io/projected/3c970e39-c0b1-4690-8e0a-f925a49d72a9-kube-api-access-xgxhp\") pod \"metallb-operator-webhook-server-778f645448-g7nc5\" (UID: \"3c970e39-c0b1-4690-8e0a-f925a49d72a9\") " pod="metallb-system/metallb-operator-webhook-server-778f645448-g7nc5" Nov 28 13:32:18 crc kubenswrapper[4970]: I1128 13:32:18.763333 4970 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-778f645448-g7nc5" Nov 28 13:32:18 crc kubenswrapper[4970]: I1128 13:32:18.979532 4970 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-778f645448-g7nc5"] Nov 28 13:32:18 crc kubenswrapper[4970]: W1128 13:32:18.988510 4970 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3c970e39_c0b1_4690_8e0a_f925a49d72a9.slice/crio-e711ee9618c9d5530f17f0cf209f564506a9e4ce426395e38df1cceea34f6bfc WatchSource:0}: Error finding container e711ee9618c9d5530f17f0cf209f564506a9e4ce426395e38df1cceea34f6bfc: Status 404 returned error can't find the container with id e711ee9618c9d5530f17f0cf209f564506a9e4ce426395e38df1cceea34f6bfc Nov 28 13:32:19 crc kubenswrapper[4970]: I1128 13:32:19.991727 4970 generic.go:334] "Generic (PLEG): container finished" podID="f9bb171b-73a8-46b5-bc21-ecf897ae6312" containerID="6e14144ee63ede436e8a824d0674806424ed6827f8389bd8981b4aefefd1f1ad" exitCode=0 Nov 28 13:32:19 crc kubenswrapper[4970]: I1128 13:32:19.991798 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jwr9r" event={"ID":"f9bb171b-73a8-46b5-bc21-ecf897ae6312","Type":"ContainerDied","Data":"6e14144ee63ede436e8a824d0674806424ed6827f8389bd8981b4aefefd1f1ad"} Nov 28 13:32:20 crc kubenswrapper[4970]: E1128 13:32:20.811732 4970 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 6e14144ee63ede436e8a824d0674806424ed6827f8389bd8981b4aefefd1f1ad is running failed: container process not found" containerID="6e14144ee63ede436e8a824d0674806424ed6827f8389bd8981b4aefefd1f1ad" cmd=["grpc_health_probe","-addr=:50051"] Nov 28 13:32:20 crc kubenswrapper[4970]: E1128 13:32:20.812367 4970 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 6e14144ee63ede436e8a824d0674806424ed6827f8389bd8981b4aefefd1f1ad is running failed: container process not found" 
containerID="6e14144ee63ede436e8a824d0674806424ed6827f8389bd8981b4aefefd1f1ad" cmd=["grpc_health_probe","-addr=:50051"] Nov 28 13:32:20 crc kubenswrapper[4970]: E1128 13:32:20.813006 4970 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 6e14144ee63ede436e8a824d0674806424ed6827f8389bd8981b4aefefd1f1ad is running failed: container process not found" containerID="6e14144ee63ede436e8a824d0674806424ed6827f8389bd8981b4aefefd1f1ad" cmd=["grpc_health_probe","-addr=:50051"] Nov 28 13:32:20 crc kubenswrapper[4970]: E1128 13:32:20.813098 4970 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 6e14144ee63ede436e8a824d0674806424ed6827f8389bd8981b4aefefd1f1ad is running failed: container process not found" probeType="Readiness" pod="openshift-marketplace/community-operators-jwr9r" podUID="f9bb171b-73a8-46b5-bc21-ecf897ae6312" containerName="registry-server" Nov 28 13:32:21 crc kubenswrapper[4970]: I1128 13:32:21.001379 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-7b8bd764cc-xwfzf" event={"ID":"04e4d11f-bb00-41b3-9047-0669f0e051c2","Type":"ContainerStarted","Data":"50f8d6f5ae2794114a4676e0bbcde91db5a1251ba3e3eae7eb7b395eb6ed148f"} Nov 28 13:32:21 crc kubenswrapper[4970]: I1128 13:32:21.003762 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-778f645448-g7nc5" event={"ID":"3c970e39-c0b1-4690-8e0a-f925a49d72a9","Type":"ContainerStarted","Data":"e711ee9618c9d5530f17f0cf209f564506a9e4ce426395e38df1cceea34f6bfc"} Nov 28 13:32:21 crc kubenswrapper[4970]: I1128 13:32:21.248742 4970 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-jwr9r" Nov 28 13:32:21 crc kubenswrapper[4970]: I1128 13:32:21.333851 4970 patch_prober.go:28] interesting pod/machine-config-daemon-tjrng container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 28 13:32:21 crc kubenswrapper[4970]: I1128 13:32:21.333912 4970 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-tjrng" podUID="70bedd43-c527-436e-b47b-0b9ec5b10601" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 28 13:32:21 crc kubenswrapper[4970]: I1128 13:32:21.356845 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f9bb171b-73a8-46b5-bc21-ecf897ae6312-utilities\") pod \"f9bb171b-73a8-46b5-bc21-ecf897ae6312\" (UID: \"f9bb171b-73a8-46b5-bc21-ecf897ae6312\") " Nov 28 13:32:21 crc kubenswrapper[4970]: I1128 13:32:21.356918 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f9bb171b-73a8-46b5-bc21-ecf897ae6312-catalog-content\") pod \"f9bb171b-73a8-46b5-bc21-ecf897ae6312\" (UID: \"f9bb171b-73a8-46b5-bc21-ecf897ae6312\") " Nov 28 13:32:21 crc kubenswrapper[4970]: I1128 13:32:21.356992 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sl2nd\" (UniqueName: \"kubernetes.io/projected/f9bb171b-73a8-46b5-bc21-ecf897ae6312-kube-api-access-sl2nd\") pod \"f9bb171b-73a8-46b5-bc21-ecf897ae6312\" (UID: \"f9bb171b-73a8-46b5-bc21-ecf897ae6312\") " Nov 28 13:32:21 crc kubenswrapper[4970]: I1128 13:32:21.357801 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f9bb171b-73a8-46b5-bc21-ecf897ae6312-utilities" (OuterVolumeSpecName: "utilities") pod "f9bb171b-73a8-46b5-bc21-ecf897ae6312" (UID: "f9bb171b-73a8-46b5-bc21-ecf897ae6312"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 28 13:32:21 crc kubenswrapper[4970]: I1128 13:32:21.365531 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f9bb171b-73a8-46b5-bc21-ecf897ae6312-kube-api-access-sl2nd" (OuterVolumeSpecName: "kube-api-access-sl2nd") pod "f9bb171b-73a8-46b5-bc21-ecf897ae6312" (UID: "f9bb171b-73a8-46b5-bc21-ecf897ae6312"). InnerVolumeSpecName "kube-api-access-sl2nd". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 28 13:32:21 crc kubenswrapper[4970]: I1128 13:32:21.414265 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f9bb171b-73a8-46b5-bc21-ecf897ae6312-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f9bb171b-73a8-46b5-bc21-ecf897ae6312" (UID: "f9bb171b-73a8-46b5-bc21-ecf897ae6312"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 28 13:32:21 crc kubenswrapper[4970]: I1128 13:32:21.458563 4970 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f9bb171b-73a8-46b5-bc21-ecf897ae6312-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 28 13:32:21 crc kubenswrapper[4970]: I1128 13:32:21.458606 4970 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sl2nd\" (UniqueName: \"kubernetes.io/projected/f9bb171b-73a8-46b5-bc21-ecf897ae6312-kube-api-access-sl2nd\") on node \"crc\" DevicePath \"\"" Nov 28 13:32:21 crc kubenswrapper[4970]: I1128 13:32:21.458621 4970 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f9bb171b-73a8-46b5-bc21-ecf897ae6312-utilities\") on node \"crc\" DevicePath \"\"" Nov 28 13:32:21 crc kubenswrapper[4970]: I1128 13:32:21.463598 4970 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-689wj" Nov 28 13:32:21 crc kubenswrapper[4970]: I1128 13:32:21.511841 4970 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-689wj" Nov 28 13:32:22 crc kubenswrapper[4970]: I1128 13:32:22.012073 4970 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-jwr9r" Nov 28 13:32:22 crc kubenswrapper[4970]: I1128 13:32:22.014435 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jwr9r" event={"ID":"f9bb171b-73a8-46b5-bc21-ecf897ae6312","Type":"ContainerDied","Data":"f1c325697b5cd8749cfedb4c9c72d7ec9c94c1017e4e14446c3501aa6cb633d7"} Nov 28 13:32:22 crc kubenswrapper[4970]: I1128 13:32:22.014502 4970 scope.go:117] "RemoveContainer" containerID="6e14144ee63ede436e8a824d0674806424ed6827f8389bd8981b4aefefd1f1ad" Nov 28 13:32:22 crc kubenswrapper[4970]: I1128 13:32:22.039653 4970 scope.go:117] "RemoveContainer" containerID="9fee5d7d52ce7e19b59ff637d23594dc78c7bd72d07ed0e2b0726f660fe20b2f" Nov 28 13:32:22 crc kubenswrapper[4970]: I1128 13:32:22.048294 4970 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-jwr9r"] Nov 28 13:32:22 crc kubenswrapper[4970]: I1128 13:32:22.051185 4970 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-jwr9r"] Nov 28 13:32:22 crc kubenswrapper[4970]: I1128 13:32:22.076428 4970 scope.go:117] "RemoveContainer" containerID="fafcaf396d4893977df7b8e7b7b5d560aa4983ccd422197ff38545e9bdf3813d" Nov 28 13:32:23 crc kubenswrapper[4970]: I1128 13:32:23.390399 4970 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f9bb171b-73a8-46b5-bc21-ecf897ae6312" path="/var/lib/kubelet/pods/f9bb171b-73a8-46b5-bc21-ecf897ae6312/volumes" Nov 28 13:32:24 crc kubenswrapper[4970]: I1128 13:32:24.025592 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-7b8bd764cc-xwfzf" event={"ID":"04e4d11f-bb00-41b3-9047-0669f0e051c2","Type":"ContainerStarted","Data":"75074bda77caeb8b24a52a1d35763bebfc60df7b79a64ede0e248b4f9a496f4f"} Nov 28 13:32:24 crc kubenswrapper[4970]: I1128 13:32:24.025789 4970 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-controller-manager-7b8bd764cc-xwfzf" Nov 28 13:32:24 crc kubenswrapper[4970]: I1128 13:32:24.055274 4970 pod_startup_latency_tracker.go:104] 
"Observed pod startup duration" pod="metallb-system/metallb-operator-controller-manager-7b8bd764cc-xwfzf" podStartSLOduration=1.685897136 podStartE2EDuration="7.05525347s" podCreationTimestamp="2025-11-28 13:32:17 +0000 UTC" firstStartedPulling="2025-11-28 13:32:17.956615677 +0000 UTC m=+748.809497477" lastFinishedPulling="2025-11-28 13:32:23.325972011 +0000 UTC m=+754.178853811" observedRunningTime="2025-11-28 13:32:24.051851193 +0000 UTC m=+754.904733003" watchObservedRunningTime="2025-11-28 13:32:24.05525347 +0000 UTC m=+754.908135270" Nov 28 13:32:26 crc kubenswrapper[4970]: I1128 13:32:26.257515 4970 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-689wj"] Nov 28 13:32:26 crc kubenswrapper[4970]: I1128 13:32:26.258184 4970 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-689wj" podUID="8662a1e5-aa56-4150-bc05-815f90bd16cd" containerName="registry-server" containerID="cri-o://d800e940f815b147b534d14c787ece37979cfbf8e8e38ad9dae376807f2b2f13" gracePeriod=2 Nov 28 13:32:27 crc kubenswrapper[4970]: I1128 13:32:27.070526 4970 generic.go:334] "Generic (PLEG): container finished" podID="8662a1e5-aa56-4150-bc05-815f90bd16cd" containerID="d800e940f815b147b534d14c787ece37979cfbf8e8e38ad9dae376807f2b2f13" exitCode=0 Nov 28 13:32:27 crc kubenswrapper[4970]: I1128 13:32:27.070666 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-689wj" event={"ID":"8662a1e5-aa56-4150-bc05-815f90bd16cd","Type":"ContainerDied","Data":"d800e940f815b147b534d14c787ece37979cfbf8e8e38ad9dae376807f2b2f13"} Nov 28 13:32:27 crc kubenswrapper[4970]: I1128 13:32:27.070842 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-689wj" event={"ID":"8662a1e5-aa56-4150-bc05-815f90bd16cd","Type":"ContainerDied","Data":"340a5382bf45b3fcad9e50bb85fcd685bc0450d233618a9ae2f4d3ef65a6cced"} Nov 28 13:32:27 crc kubenswrapper[4970]: I1128 13:32:27.070861 4970 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="340a5382bf45b3fcad9e50bb85fcd685bc0450d233618a9ae2f4d3ef65a6cced" Nov 28 13:32:27 crc kubenswrapper[4970]: I1128 13:32:27.153302 4970 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-689wj" Nov 28 13:32:27 crc kubenswrapper[4970]: I1128 13:32:27.189632 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8662a1e5-aa56-4150-bc05-815f90bd16cd-catalog-content\") pod \"8662a1e5-aa56-4150-bc05-815f90bd16cd\" (UID: \"8662a1e5-aa56-4150-bc05-815f90bd16cd\") " Nov 28 13:32:27 crc kubenswrapper[4970]: I1128 13:32:27.290861 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8662a1e5-aa56-4150-bc05-815f90bd16cd-utilities\") pod \"8662a1e5-aa56-4150-bc05-815f90bd16cd\" (UID: \"8662a1e5-aa56-4150-bc05-815f90bd16cd\") " Nov 28 13:32:27 crc kubenswrapper[4970]: I1128 13:32:27.291613 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-28jfp\" (UniqueName: \"kubernetes.io/projected/8662a1e5-aa56-4150-bc05-815f90bd16cd-kube-api-access-28jfp\") pod \"8662a1e5-aa56-4150-bc05-815f90bd16cd\" (UID: \"8662a1e5-aa56-4150-bc05-815f90bd16cd\") " Nov 28 13:32:27 crc kubenswrapper[4970]: I1128 13:32:27.291765 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8662a1e5-aa56-4150-bc05-815f90bd16cd-utilities" (OuterVolumeSpecName: "utilities") pod "8662a1e5-aa56-4150-bc05-815f90bd16cd" (UID: "8662a1e5-aa56-4150-bc05-815f90bd16cd"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 28 13:32:27 crc kubenswrapper[4970]: I1128 13:32:27.292024 4970 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8662a1e5-aa56-4150-bc05-815f90bd16cd-utilities\") on node \"crc\" DevicePath \"\"" Nov 28 13:32:27 crc kubenswrapper[4970]: I1128 13:32:27.301541 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8662a1e5-aa56-4150-bc05-815f90bd16cd-kube-api-access-28jfp" (OuterVolumeSpecName: "kube-api-access-28jfp") pod "8662a1e5-aa56-4150-bc05-815f90bd16cd" (UID: "8662a1e5-aa56-4150-bc05-815f90bd16cd"). InnerVolumeSpecName "kube-api-access-28jfp". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 28 13:32:27 crc kubenswrapper[4970]: I1128 13:32:27.306073 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8662a1e5-aa56-4150-bc05-815f90bd16cd-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "8662a1e5-aa56-4150-bc05-815f90bd16cd" (UID: "8662a1e5-aa56-4150-bc05-815f90bd16cd"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 28 13:32:27 crc kubenswrapper[4970]: I1128 13:32:27.392994 4970 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-28jfp\" (UniqueName: \"kubernetes.io/projected/8662a1e5-aa56-4150-bc05-815f90bd16cd-kube-api-access-28jfp\") on node \"crc\" DevicePath \"\"" Nov 28 13:32:27 crc kubenswrapper[4970]: I1128 13:32:27.393257 4970 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8662a1e5-aa56-4150-bc05-815f90bd16cd-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 28 13:32:28 crc kubenswrapper[4970]: I1128 13:32:28.079691 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-778f645448-g7nc5" event={"ID":"3c970e39-c0b1-4690-8e0a-f925a49d72a9","Type":"ContainerStarted","Data":"e8bd0d8df06b440d83b0882d4e1ff792aab711eff44dbb278b442a80fb35ed3a"} Nov 28 13:32:28 crc kubenswrapper[4970]: I1128 13:32:28.080175 4970 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-webhook-server-778f645448-g7nc5" Nov 28 13:32:28 crc kubenswrapper[4970]: I1128 13:32:28.079732 4970 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-689wj" Nov 28 13:32:28 crc kubenswrapper[4970]: I1128 13:32:28.110471 4970 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-webhook-server-778f645448-g7nc5" podStartSLOduration=1.697134894 podStartE2EDuration="10.110444119s" podCreationTimestamp="2025-11-28 13:32:18 +0000 UTC" firstStartedPulling="2025-11-28 13:32:18.992804699 +0000 UTC m=+749.845686499" lastFinishedPulling="2025-11-28 13:32:27.406113924 +0000 UTC m=+758.258995724" observedRunningTime="2025-11-28 13:32:28.105732144 +0000 UTC m=+758.958613944" watchObservedRunningTime="2025-11-28 13:32:28.110444119 +0000 UTC m=+758.963325919" Nov 28 13:32:28 crc kubenswrapper[4970]: I1128 13:32:28.158295 4970 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-689wj"] Nov 28 13:32:28 crc kubenswrapper[4970]: I1128 13:32:28.168504 4970 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-689wj"] Nov 28 13:32:29 crc kubenswrapper[4970]: I1128 13:32:29.387835 4970 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8662a1e5-aa56-4150-bc05-815f90bd16cd" path="/var/lib/kubelet/pods/8662a1e5-aa56-4150-bc05-815f90bd16cd/volumes" Nov 28 13:32:38 crc kubenswrapper[4970]: I1128 13:32:38.767863 4970 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-webhook-server-778f645448-g7nc5" Nov 28 13:32:51 crc kubenswrapper[4970]: I1128 13:32:51.333907 4970 patch_prober.go:28] interesting pod/machine-config-daemon-tjrng container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 28 13:32:51 crc kubenswrapper[4970]: I1128 13:32:51.334461 4970 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-tjrng" podUID="70bedd43-c527-436e-b47b-0b9ec5b10601" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 28 13:32:51 
crc kubenswrapper[4970]: I1128 13:32:51.334510 4970 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-tjrng" Nov 28 13:32:51 crc kubenswrapper[4970]: I1128 13:32:51.335150 4970 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"86a03fe6c83c6ac3411e98ed1337717f0b27b46f31a13d39550e07889da6badd"} pod="openshift-machine-config-operator/machine-config-daemon-tjrng" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Nov 28 13:32:51 crc kubenswrapper[4970]: I1128 13:32:51.335232 4970 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-tjrng" podUID="70bedd43-c527-436e-b47b-0b9ec5b10601" containerName="machine-config-daemon" containerID="cri-o://86a03fe6c83c6ac3411e98ed1337717f0b27b46f31a13d39550e07889da6badd" gracePeriod=600 Nov 28 13:32:51 crc kubenswrapper[4970]: I1128 13:32:51.541521 4970 generic.go:334] "Generic (PLEG): container finished" podID="70bedd43-c527-436e-b47b-0b9ec5b10601" containerID="86a03fe6c83c6ac3411e98ed1337717f0b27b46f31a13d39550e07889da6badd" exitCode=0 Nov 28 13:32:51 crc kubenswrapper[4970]: I1128 13:32:51.541637 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-tjrng" event={"ID":"70bedd43-c527-436e-b47b-0b9ec5b10601","Type":"ContainerDied","Data":"86a03fe6c83c6ac3411e98ed1337717f0b27b46f31a13d39550e07889da6badd"} Nov 28 13:32:51 crc kubenswrapper[4970]: I1128 13:32:51.542014 4970 scope.go:117] "RemoveContainer" containerID="1c9fff9237be9f51a58f9e7060799a552c40dc5fcc7d7c71f57ed50492cd23cf" Nov 28 13:32:52 crc kubenswrapper[4970]: I1128 13:32:52.551742 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-tjrng" event={"ID":"70bedd43-c527-436e-b47b-0b9ec5b10601","Type":"ContainerStarted","Data":"2ad74f7ddfaa8d711be3a8043f5b9573ad4e845f67f91479eefe9466a3a483c3"} Nov 28 13:32:57 crc kubenswrapper[4970]: I1128 13:32:57.702042 4970 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-controller-manager-7b8bd764cc-xwfzf" Nov 28 13:32:58 crc kubenswrapper[4970]: I1128 13:32:58.487307 4970 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-7qf2x"] Nov 28 13:32:58 crc kubenswrapper[4970]: E1128 13:32:58.487516 4970 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f9bb171b-73a8-46b5-bc21-ecf897ae6312" containerName="registry-server" Nov 28 13:32:58 crc kubenswrapper[4970]: I1128 13:32:58.487528 4970 state_mem.go:107] "Deleted CPUSet assignment" podUID="f9bb171b-73a8-46b5-bc21-ecf897ae6312" containerName="registry-server" Nov 28 13:32:58 crc kubenswrapper[4970]: E1128 13:32:58.487539 4970 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f9bb171b-73a8-46b5-bc21-ecf897ae6312" containerName="extract-utilities" Nov 28 13:32:58 crc kubenswrapper[4970]: I1128 13:32:58.487545 4970 state_mem.go:107] "Deleted CPUSet assignment" podUID="f9bb171b-73a8-46b5-bc21-ecf897ae6312" containerName="extract-utilities" Nov 28 13:32:58 crc kubenswrapper[4970]: E1128 13:32:58.487559 4970 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8662a1e5-aa56-4150-bc05-815f90bd16cd" containerName="registry-server" Nov 28 13:32:58 crc kubenswrapper[4970]: I1128 13:32:58.487566 
4970 state_mem.go:107] "Deleted CPUSet assignment" podUID="8662a1e5-aa56-4150-bc05-815f90bd16cd" containerName="registry-server" Nov 28 13:32:58 crc kubenswrapper[4970]: E1128 13:32:58.487578 4970 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f9bb171b-73a8-46b5-bc21-ecf897ae6312" containerName="extract-content" Nov 28 13:32:58 crc kubenswrapper[4970]: I1128 13:32:58.487584 4970 state_mem.go:107] "Deleted CPUSet assignment" podUID="f9bb171b-73a8-46b5-bc21-ecf897ae6312" containerName="extract-content" Nov 28 13:32:58 crc kubenswrapper[4970]: E1128 13:32:58.487597 4970 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8662a1e5-aa56-4150-bc05-815f90bd16cd" containerName="extract-content" Nov 28 13:32:58 crc kubenswrapper[4970]: I1128 13:32:58.487603 4970 state_mem.go:107] "Deleted CPUSet assignment" podUID="8662a1e5-aa56-4150-bc05-815f90bd16cd" containerName="extract-content" Nov 28 13:32:58 crc kubenswrapper[4970]: E1128 13:32:58.487612 4970 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8662a1e5-aa56-4150-bc05-815f90bd16cd" containerName="extract-utilities" Nov 28 13:32:58 crc kubenswrapper[4970]: I1128 13:32:58.487618 4970 state_mem.go:107] "Deleted CPUSet assignment" podUID="8662a1e5-aa56-4150-bc05-815f90bd16cd" containerName="extract-utilities" Nov 28 13:32:58 crc kubenswrapper[4970]: I1128 13:32:58.487729 4970 memory_manager.go:354] "RemoveStaleState removing state" podUID="8662a1e5-aa56-4150-bc05-815f90bd16cd" containerName="registry-server" Nov 28 13:32:58 crc kubenswrapper[4970]: I1128 13:32:58.487742 4970 memory_manager.go:354] "RemoveStaleState removing state" podUID="f9bb171b-73a8-46b5-bc21-ecf897ae6312" containerName="registry-server" Nov 28 13:32:58 crc kubenswrapper[4970]: I1128 13:32:58.489504 4970 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-7qf2x" Nov 28 13:32:58 crc kubenswrapper[4970]: I1128 13:32:58.491301 4970 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-certs-secret" Nov 28 13:32:58 crc kubenswrapper[4970]: I1128 13:32:58.491333 4970 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"frr-startup" Nov 28 13:32:58 crc kubenswrapper[4970]: I1128 13:32:58.493863 4970 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-daemon-dockercfg-xfpjs" Nov 28 13:32:58 crc kubenswrapper[4970]: I1128 13:32:58.517173 4970 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-webhook-server-7fcb986d4-tdbd5"] Nov 28 13:32:58 crc kubenswrapper[4970]: I1128 13:32:58.518352 4970 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-tdbd5" Nov 28 13:32:58 crc kubenswrapper[4970]: I1128 13:32:58.521020 4970 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-webhook-server-cert" Nov 28 13:32:58 crc kubenswrapper[4970]: I1128 13:32:58.538170 4970 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-7fcb986d4-tdbd5"] Nov 28 13:32:58 crc kubenswrapper[4970]: I1128 13:32:58.589362 4970 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/speaker-vj6nx"] Nov 28 13:32:58 crc kubenswrapper[4970]: I1128 13:32:58.590296 4970 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/speaker-vj6nx" Nov 28 13:32:58 crc kubenswrapper[4970]: I1128 13:32:58.593620 4970 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-memberlist" Nov 28 13:32:58 crc kubenswrapper[4970]: I1128 13:32:58.593669 4970 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-dockercfg-jschb" Nov 28 13:32:58 crc kubenswrapper[4970]: I1128 13:32:58.593628 4970 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-certs-secret" Nov 28 13:32:58 crc kubenswrapper[4970]: I1128 13:32:58.594425 4970 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"metallb-excludel2" Nov 28 13:32:58 crc kubenswrapper[4970]: I1128 13:32:58.599269 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d8109ee2-cb6f-4706-a9d0-93fbec9b4234-metrics-certs\") pod \"frr-k8s-7qf2x\" (UID: \"d8109ee2-cb6f-4706-a9d0-93fbec9b4234\") " pod="metallb-system/frr-k8s-7qf2x" Nov 28 13:32:58 crc kubenswrapper[4970]: I1128 13:32:58.599327 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/d8109ee2-cb6f-4706-a9d0-93fbec9b4234-metrics\") pod \"frr-k8s-7qf2x\" (UID: \"d8109ee2-cb6f-4706-a9d0-93fbec9b4234\") " pod="metallb-system/frr-k8s-7qf2x" Nov 28 13:32:58 crc kubenswrapper[4970]: I1128 13:32:58.599438 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hbdsq\" (UniqueName: \"kubernetes.io/projected/d8109ee2-cb6f-4706-a9d0-93fbec9b4234-kube-api-access-hbdsq\") pod \"frr-k8s-7qf2x\" (UID: \"d8109ee2-cb6f-4706-a9d0-93fbec9b4234\") " pod="metallb-system/frr-k8s-7qf2x" Nov 28 13:32:58 crc kubenswrapper[4970]: I1128 13:32:58.599640 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/d8109ee2-cb6f-4706-a9d0-93fbec9b4234-frr-sockets\") pod \"frr-k8s-7qf2x\" (UID: \"d8109ee2-cb6f-4706-a9d0-93fbec9b4234\") " pod="metallb-system/frr-k8s-7qf2x" Nov 28 13:32:58 crc kubenswrapper[4970]: I1128 13:32:58.599703 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/d8109ee2-cb6f-4706-a9d0-93fbec9b4234-frr-conf\") pod \"frr-k8s-7qf2x\" (UID: \"d8109ee2-cb6f-4706-a9d0-93fbec9b4234\") " pod="metallb-system/frr-k8s-7qf2x" Nov 28 13:32:58 crc kubenswrapper[4970]: I1128 13:32:58.599896 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/d8109ee2-cb6f-4706-a9d0-93fbec9b4234-reloader\") pod \"frr-k8s-7qf2x\" (UID: \"d8109ee2-cb6f-4706-a9d0-93fbec9b4234\") " pod="metallb-system/frr-k8s-7qf2x" Nov 28 13:32:58 crc kubenswrapper[4970]: I1128 13:32:58.599959 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/d8109ee2-cb6f-4706-a9d0-93fbec9b4234-frr-startup\") pod \"frr-k8s-7qf2x\" (UID: \"d8109ee2-cb6f-4706-a9d0-93fbec9b4234\") " pod="metallb-system/frr-k8s-7qf2x" Nov 28 13:32:58 crc kubenswrapper[4970]: I1128 13:32:58.609463 4970 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/controller-f8648f98b-7s5w4"] Nov 28 
13:32:58 crc kubenswrapper[4970]: I1128 13:32:58.610458 4970 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/controller-f8648f98b-7s5w4" Nov 28 13:32:58 crc kubenswrapper[4970]: I1128 13:32:58.611698 4970 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-certs-secret" Nov 28 13:32:58 crc kubenswrapper[4970]: I1128 13:32:58.620183 4970 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-f8648f98b-7s5w4"] Nov 28 13:32:58 crc kubenswrapper[4970]: I1128 13:32:58.701198 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ce7e9380-adac-4723-8ced-16693bce1923-metrics-certs\") pod \"speaker-vj6nx\" (UID: \"ce7e9380-adac-4723-8ced-16693bce1923\") " pod="metallb-system/speaker-vj6nx" Nov 28 13:32:58 crc kubenswrapper[4970]: I1128 13:32:58.701314 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/d8109ee2-cb6f-4706-a9d0-93fbec9b4234-frr-conf\") pod \"frr-k8s-7qf2x\" (UID: \"d8109ee2-cb6f-4706-a9d0-93fbec9b4234\") " pod="metallb-system/frr-k8s-7qf2x" Nov 28 13:32:58 crc kubenswrapper[4970]: I1128 13:32:58.701348 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/ce7e9380-adac-4723-8ced-16693bce1923-metallb-excludel2\") pod \"speaker-vj6nx\" (UID: \"ce7e9380-adac-4723-8ced-16693bce1923\") " pod="metallb-system/speaker-vj6nx" Nov 28 13:32:58 crc kubenswrapper[4970]: I1128 13:32:58.701376 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/ce7e9380-adac-4723-8ced-16693bce1923-memberlist\") pod \"speaker-vj6nx\" (UID: \"ce7e9380-adac-4723-8ced-16693bce1923\") " pod="metallb-system/speaker-vj6nx" Nov 28 13:32:58 crc kubenswrapper[4970]: I1128 13:32:58.701421 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/d8109ee2-cb6f-4706-a9d0-93fbec9b4234-reloader\") pod \"frr-k8s-7qf2x\" (UID: \"d8109ee2-cb6f-4706-a9d0-93fbec9b4234\") " pod="metallb-system/frr-k8s-7qf2x" Nov 28 13:32:58 crc kubenswrapper[4970]: I1128 13:32:58.701445 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/d8109ee2-cb6f-4706-a9d0-93fbec9b4234-frr-startup\") pod \"frr-k8s-7qf2x\" (UID: \"d8109ee2-cb6f-4706-a9d0-93fbec9b4234\") " pod="metallb-system/frr-k8s-7qf2x" Nov 28 13:32:58 crc kubenswrapper[4970]: I1128 13:32:58.701491 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7xs2f\" (UniqueName: \"kubernetes.io/projected/539244ae-76b7-443b-9352-5d8d2f8da8e9-kube-api-access-7xs2f\") pod \"frr-k8s-webhook-server-7fcb986d4-tdbd5\" (UID: \"539244ae-76b7-443b-9352-5d8d2f8da8e9\") " pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-tdbd5" Nov 28 13:32:58 crc kubenswrapper[4970]: I1128 13:32:58.701516 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d8109ee2-cb6f-4706-a9d0-93fbec9b4234-metrics-certs\") pod \"frr-k8s-7qf2x\" (UID: \"d8109ee2-cb6f-4706-a9d0-93fbec9b4234\") " pod="metallb-system/frr-k8s-7qf2x" Nov 28 13:32:58 crc 
kubenswrapper[4970]: I1128 13:32:58.701534 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/539244ae-76b7-443b-9352-5d8d2f8da8e9-cert\") pod \"frr-k8s-webhook-server-7fcb986d4-tdbd5\" (UID: \"539244ae-76b7-443b-9352-5d8d2f8da8e9\") " pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-tdbd5" Nov 28 13:32:58 crc kubenswrapper[4970]: I1128 13:32:58.701587 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/d8109ee2-cb6f-4706-a9d0-93fbec9b4234-metrics\") pod \"frr-k8s-7qf2x\" (UID: \"d8109ee2-cb6f-4706-a9d0-93fbec9b4234\") " pod="metallb-system/frr-k8s-7qf2x" Nov 28 13:32:58 crc kubenswrapper[4970]: I1128 13:32:58.701604 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hbdsq\" (UniqueName: \"kubernetes.io/projected/d8109ee2-cb6f-4706-a9d0-93fbec9b4234-kube-api-access-hbdsq\") pod \"frr-k8s-7qf2x\" (UID: \"d8109ee2-cb6f-4706-a9d0-93fbec9b4234\") " pod="metallb-system/frr-k8s-7qf2x" Nov 28 13:32:58 crc kubenswrapper[4970]: I1128 13:32:58.701666 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/d8109ee2-cb6f-4706-a9d0-93fbec9b4234-frr-sockets\") pod \"frr-k8s-7qf2x\" (UID: \"d8109ee2-cb6f-4706-a9d0-93fbec9b4234\") " pod="metallb-system/frr-k8s-7qf2x" Nov 28 13:32:58 crc kubenswrapper[4970]: I1128 13:32:58.701682 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h7pbm\" (UniqueName: \"kubernetes.io/projected/ce7e9380-adac-4723-8ced-16693bce1923-kube-api-access-h7pbm\") pod \"speaker-vj6nx\" (UID: \"ce7e9380-adac-4723-8ced-16693bce1923\") " pod="metallb-system/speaker-vj6nx" Nov 28 13:32:58 crc kubenswrapper[4970]: I1128 13:32:58.701902 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/d8109ee2-cb6f-4706-a9d0-93fbec9b4234-frr-conf\") pod \"frr-k8s-7qf2x\" (UID: \"d8109ee2-cb6f-4706-a9d0-93fbec9b4234\") " pod="metallb-system/frr-k8s-7qf2x" Nov 28 13:32:58 crc kubenswrapper[4970]: I1128 13:32:58.701960 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/d8109ee2-cb6f-4706-a9d0-93fbec9b4234-reloader\") pod \"frr-k8s-7qf2x\" (UID: \"d8109ee2-cb6f-4706-a9d0-93fbec9b4234\") " pod="metallb-system/frr-k8s-7qf2x" Nov 28 13:32:58 crc kubenswrapper[4970]: I1128 13:32:58.701989 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/d8109ee2-cb6f-4706-a9d0-93fbec9b4234-metrics\") pod \"frr-k8s-7qf2x\" (UID: \"d8109ee2-cb6f-4706-a9d0-93fbec9b4234\") " pod="metallb-system/frr-k8s-7qf2x" Nov 28 13:32:58 crc kubenswrapper[4970]: I1128 13:32:58.702186 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/d8109ee2-cb6f-4706-a9d0-93fbec9b4234-frr-sockets\") pod \"frr-k8s-7qf2x\" (UID: \"d8109ee2-cb6f-4706-a9d0-93fbec9b4234\") " pod="metallb-system/frr-k8s-7qf2x" Nov 28 13:32:58 crc kubenswrapper[4970]: I1128 13:32:58.702596 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/d8109ee2-cb6f-4706-a9d0-93fbec9b4234-frr-startup\") pod \"frr-k8s-7qf2x\" (UID: 
\"d8109ee2-cb6f-4706-a9d0-93fbec9b4234\") " pod="metallb-system/frr-k8s-7qf2x" Nov 28 13:32:58 crc kubenswrapper[4970]: I1128 13:32:58.709112 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d8109ee2-cb6f-4706-a9d0-93fbec9b4234-metrics-certs\") pod \"frr-k8s-7qf2x\" (UID: \"d8109ee2-cb6f-4706-a9d0-93fbec9b4234\") " pod="metallb-system/frr-k8s-7qf2x" Nov 28 13:32:58 crc kubenswrapper[4970]: I1128 13:32:58.722778 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hbdsq\" (UniqueName: \"kubernetes.io/projected/d8109ee2-cb6f-4706-a9d0-93fbec9b4234-kube-api-access-hbdsq\") pod \"frr-k8s-7qf2x\" (UID: \"d8109ee2-cb6f-4706-a9d0-93fbec9b4234\") " pod="metallb-system/frr-k8s-7qf2x" Nov 28 13:32:58 crc kubenswrapper[4970]: I1128 13:32:58.803146 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h7pbm\" (UniqueName: \"kubernetes.io/projected/ce7e9380-adac-4723-8ced-16693bce1923-kube-api-access-h7pbm\") pod \"speaker-vj6nx\" (UID: \"ce7e9380-adac-4723-8ced-16693bce1923\") " pod="metallb-system/speaker-vj6nx" Nov 28 13:32:58 crc kubenswrapper[4970]: I1128 13:32:58.803202 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e1eb09e8-cbb7-416b-9683-a42a8b611239-metrics-certs\") pod \"controller-f8648f98b-7s5w4\" (UID: \"e1eb09e8-cbb7-416b-9683-a42a8b611239\") " pod="metallb-system/controller-f8648f98b-7s5w4" Nov 28 13:32:58 crc kubenswrapper[4970]: I1128 13:32:58.803253 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ce7e9380-adac-4723-8ced-16693bce1923-metrics-certs\") pod \"speaker-vj6nx\" (UID: \"ce7e9380-adac-4723-8ced-16693bce1923\") " pod="metallb-system/speaker-vj6nx" Nov 28 13:32:58 crc kubenswrapper[4970]: I1128 13:32:58.803302 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/ce7e9380-adac-4723-8ced-16693bce1923-metallb-excludel2\") pod \"speaker-vj6nx\" (UID: \"ce7e9380-adac-4723-8ced-16693bce1923\") " pod="metallb-system/speaker-vj6nx" Nov 28 13:32:58 crc kubenswrapper[4970]: I1128 13:32:58.803328 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/ce7e9380-adac-4723-8ced-16693bce1923-memberlist\") pod \"speaker-vj6nx\" (UID: \"ce7e9380-adac-4723-8ced-16693bce1923\") " pod="metallb-system/speaker-vj6nx" Nov 28 13:32:58 crc kubenswrapper[4970]: I1128 13:32:58.803363 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ddhjw\" (UniqueName: \"kubernetes.io/projected/e1eb09e8-cbb7-416b-9683-a42a8b611239-kube-api-access-ddhjw\") pod \"controller-f8648f98b-7s5w4\" (UID: \"e1eb09e8-cbb7-416b-9683-a42a8b611239\") " pod="metallb-system/controller-f8648f98b-7s5w4" Nov 28 13:32:58 crc kubenswrapper[4970]: I1128 13:32:58.803415 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e1eb09e8-cbb7-416b-9683-a42a8b611239-cert\") pod \"controller-f8648f98b-7s5w4\" (UID: \"e1eb09e8-cbb7-416b-9683-a42a8b611239\") " pod="metallb-system/controller-f8648f98b-7s5w4" Nov 28 13:32:58 crc kubenswrapper[4970]: I1128 13:32:58.803463 4970 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7xs2f\" (UniqueName: \"kubernetes.io/projected/539244ae-76b7-443b-9352-5d8d2f8da8e9-kube-api-access-7xs2f\") pod \"frr-k8s-webhook-server-7fcb986d4-tdbd5\" (UID: \"539244ae-76b7-443b-9352-5d8d2f8da8e9\") " pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-tdbd5" Nov 28 13:32:58 crc kubenswrapper[4970]: I1128 13:32:58.803482 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/539244ae-76b7-443b-9352-5d8d2f8da8e9-cert\") pod \"frr-k8s-webhook-server-7fcb986d4-tdbd5\" (UID: \"539244ae-76b7-443b-9352-5d8d2f8da8e9\") " pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-tdbd5" Nov 28 13:32:58 crc kubenswrapper[4970]: E1128 13:32:58.804192 4970 secret.go:188] Couldn't get secret metallb-system/speaker-certs-secret: secret "speaker-certs-secret" not found Nov 28 13:32:58 crc kubenswrapper[4970]: E1128 13:32:58.804276 4970 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ce7e9380-adac-4723-8ced-16693bce1923-metrics-certs podName:ce7e9380-adac-4723-8ced-16693bce1923 nodeName:}" failed. No retries permitted until 2025-11-28 13:32:59.304259623 +0000 UTC m=+790.157141423 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ce7e9380-adac-4723-8ced-16693bce1923-metrics-certs") pod "speaker-vj6nx" (UID: "ce7e9380-adac-4723-8ced-16693bce1923") : secret "speaker-certs-secret" not found Nov 28 13:32:58 crc kubenswrapper[4970]: E1128 13:32:58.804342 4970 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Nov 28 13:32:58 crc kubenswrapper[4970]: E1128 13:32:58.804385 4970 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ce7e9380-adac-4723-8ced-16693bce1923-memberlist podName:ce7e9380-adac-4723-8ced-16693bce1923 nodeName:}" failed. No retries permitted until 2025-11-28 13:32:59.304375856 +0000 UTC m=+790.157257656 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/ce7e9380-adac-4723-8ced-16693bce1923-memberlist") pod "speaker-vj6nx" (UID: "ce7e9380-adac-4723-8ced-16693bce1923") : secret "metallb-memberlist" not found Nov 28 13:32:58 crc kubenswrapper[4970]: I1128 13:32:58.804650 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/ce7e9380-adac-4723-8ced-16693bce1923-metallb-excludel2\") pod \"speaker-vj6nx\" (UID: \"ce7e9380-adac-4723-8ced-16693bce1923\") " pod="metallb-system/speaker-vj6nx" Nov 28 13:32:58 crc kubenswrapper[4970]: I1128 13:32:58.806942 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/539244ae-76b7-443b-9352-5d8d2f8da8e9-cert\") pod \"frr-k8s-webhook-server-7fcb986d4-tdbd5\" (UID: \"539244ae-76b7-443b-9352-5d8d2f8da8e9\") " pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-tdbd5" Nov 28 13:32:58 crc kubenswrapper[4970]: I1128 13:32:58.809054 4970 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-7qf2x" Nov 28 13:32:58 crc kubenswrapper[4970]: I1128 13:32:58.828531 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7xs2f\" (UniqueName: \"kubernetes.io/projected/539244ae-76b7-443b-9352-5d8d2f8da8e9-kube-api-access-7xs2f\") pod \"frr-k8s-webhook-server-7fcb986d4-tdbd5\" (UID: \"539244ae-76b7-443b-9352-5d8d2f8da8e9\") " pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-tdbd5" Nov 28 13:32:58 crc kubenswrapper[4970]: I1128 13:32:58.829000 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h7pbm\" (UniqueName: \"kubernetes.io/projected/ce7e9380-adac-4723-8ced-16693bce1923-kube-api-access-h7pbm\") pod \"speaker-vj6nx\" (UID: \"ce7e9380-adac-4723-8ced-16693bce1923\") " pod="metallb-system/speaker-vj6nx" Nov 28 13:32:58 crc kubenswrapper[4970]: I1128 13:32:58.832590 4970 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-tdbd5" Nov 28 13:32:58 crc kubenswrapper[4970]: I1128 13:32:58.906822 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e1eb09e8-cbb7-416b-9683-a42a8b611239-cert\") pod \"controller-f8648f98b-7s5w4\" (UID: \"e1eb09e8-cbb7-416b-9683-a42a8b611239\") " pod="metallb-system/controller-f8648f98b-7s5w4" Nov 28 13:32:58 crc kubenswrapper[4970]: I1128 13:32:58.906924 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e1eb09e8-cbb7-416b-9683-a42a8b611239-metrics-certs\") pod \"controller-f8648f98b-7s5w4\" (UID: \"e1eb09e8-cbb7-416b-9683-a42a8b611239\") " pod="metallb-system/controller-f8648f98b-7s5w4" Nov 28 13:32:58 crc kubenswrapper[4970]: I1128 13:32:58.907002 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ddhjw\" (UniqueName: \"kubernetes.io/projected/e1eb09e8-cbb7-416b-9683-a42a8b611239-kube-api-access-ddhjw\") pod \"controller-f8648f98b-7s5w4\" (UID: \"e1eb09e8-cbb7-416b-9683-a42a8b611239\") " pod="metallb-system/controller-f8648f98b-7s5w4" Nov 28 13:32:58 crc kubenswrapper[4970]: I1128 13:32:58.909539 4970 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Nov 28 13:32:58 crc kubenswrapper[4970]: I1128 13:32:58.912960 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e1eb09e8-cbb7-416b-9683-a42a8b611239-metrics-certs\") pod \"controller-f8648f98b-7s5w4\" (UID: \"e1eb09e8-cbb7-416b-9683-a42a8b611239\") " pod="metallb-system/controller-f8648f98b-7s5w4" Nov 28 13:32:58 crc kubenswrapper[4970]: I1128 13:32:58.923203 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e1eb09e8-cbb7-416b-9683-a42a8b611239-cert\") pod \"controller-f8648f98b-7s5w4\" (UID: \"e1eb09e8-cbb7-416b-9683-a42a8b611239\") " pod="metallb-system/controller-f8648f98b-7s5w4" Nov 28 13:32:58 crc kubenswrapper[4970]: I1128 13:32:58.938120 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ddhjw\" (UniqueName: \"kubernetes.io/projected/e1eb09e8-cbb7-416b-9683-a42a8b611239-kube-api-access-ddhjw\") pod \"controller-f8648f98b-7s5w4\" (UID: \"e1eb09e8-cbb7-416b-9683-a42a8b611239\") " pod="metallb-system/controller-f8648f98b-7s5w4" Nov 28 13:32:59 crc kubenswrapper[4970]: W1128 
13:32:59.220997 4970 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod539244ae_76b7_443b_9352_5d8d2f8da8e9.slice/crio-f5add747c7d5d81a0debb0f03b42317a7d62857724e37dd7a3101015f944a85d WatchSource:0}: Error finding container f5add747c7d5d81a0debb0f03b42317a7d62857724e37dd7a3101015f944a85d: Status 404 returned error can't find the container with id f5add747c7d5d81a0debb0f03b42317a7d62857724e37dd7a3101015f944a85d Nov 28 13:32:59 crc kubenswrapper[4970]: I1128 13:32:59.221055 4970 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-7fcb986d4-tdbd5"] Nov 28 13:32:59 crc kubenswrapper[4970]: I1128 13:32:59.223728 4970 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/controller-f8648f98b-7s5w4" Nov 28 13:32:59 crc kubenswrapper[4970]: I1128 13:32:59.313121 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/ce7e9380-adac-4723-8ced-16693bce1923-memberlist\") pod \"speaker-vj6nx\" (UID: \"ce7e9380-adac-4723-8ced-16693bce1923\") " pod="metallb-system/speaker-vj6nx" Nov 28 13:32:59 crc kubenswrapper[4970]: I1128 13:32:59.313252 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ce7e9380-adac-4723-8ced-16693bce1923-metrics-certs\") pod \"speaker-vj6nx\" (UID: \"ce7e9380-adac-4723-8ced-16693bce1923\") " pod="metallb-system/speaker-vj6nx" Nov 28 13:32:59 crc kubenswrapper[4970]: E1128 13:32:59.316582 4970 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Nov 28 13:32:59 crc kubenswrapper[4970]: E1128 13:32:59.316642 4970 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ce7e9380-adac-4723-8ced-16693bce1923-memberlist podName:ce7e9380-adac-4723-8ced-16693bce1923 nodeName:}" failed. No retries permitted until 2025-11-28 13:33:00.316624769 +0000 UTC m=+791.169506579 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/ce7e9380-adac-4723-8ced-16693bce1923-memberlist") pod "speaker-vj6nx" (UID: "ce7e9380-adac-4723-8ced-16693bce1923") : secret "metallb-memberlist" not found Nov 28 13:32:59 crc kubenswrapper[4970]: I1128 13:32:59.320003 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ce7e9380-adac-4723-8ced-16693bce1923-metrics-certs\") pod \"speaker-vj6nx\" (UID: \"ce7e9380-adac-4723-8ced-16693bce1923\") " pod="metallb-system/speaker-vj6nx" Nov 28 13:32:59 crc kubenswrapper[4970]: I1128 13:32:59.421108 4970 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-f8648f98b-7s5w4"] Nov 28 13:32:59 crc kubenswrapper[4970]: W1128 13:32:59.424682 4970 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode1eb09e8_cbb7_416b_9683_a42a8b611239.slice/crio-b903170f23747c36c98a42d6cc3617bb6e058fb476377661fdf9845853486f26 WatchSource:0}: Error finding container b903170f23747c36c98a42d6cc3617bb6e058fb476377661fdf9845853486f26: Status 404 returned error can't find the container with id b903170f23747c36c98a42d6cc3617bb6e058fb476377661fdf9845853486f26 Nov 28 13:32:59 crc kubenswrapper[4970]: I1128 13:32:59.593455 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-f8648f98b-7s5w4" event={"ID":"e1eb09e8-cbb7-416b-9683-a42a8b611239","Type":"ContainerStarted","Data":"b903170f23747c36c98a42d6cc3617bb6e058fb476377661fdf9845853486f26"} Nov 28 13:32:59 crc kubenswrapper[4970]: I1128 13:32:59.594691 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-tdbd5" event={"ID":"539244ae-76b7-443b-9352-5d8d2f8da8e9","Type":"ContainerStarted","Data":"f5add747c7d5d81a0debb0f03b42317a7d62857724e37dd7a3101015f944a85d"} Nov 28 13:33:00 crc kubenswrapper[4970]: I1128 13:33:00.328006 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/ce7e9380-adac-4723-8ced-16693bce1923-memberlist\") pod \"speaker-vj6nx\" (UID: \"ce7e9380-adac-4723-8ced-16693bce1923\") " pod="metallb-system/speaker-vj6nx" Nov 28 13:33:00 crc kubenswrapper[4970]: I1128 13:33:00.344010 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/ce7e9380-adac-4723-8ced-16693bce1923-memberlist\") pod \"speaker-vj6nx\" (UID: \"ce7e9380-adac-4723-8ced-16693bce1923\") " pod="metallb-system/speaker-vj6nx" Nov 28 13:33:00 crc kubenswrapper[4970]: I1128 13:33:00.403477 4970 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/speaker-vj6nx" Nov 28 13:33:00 crc kubenswrapper[4970]: I1128 13:33:00.607834 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-vj6nx" event={"ID":"ce7e9380-adac-4723-8ced-16693bce1923","Type":"ContainerStarted","Data":"92da71cc8a7a1a2b37029c8570633d22424636adf537c177a950a2b2534d932a"} Nov 28 13:33:00 crc kubenswrapper[4970]: I1128 13:33:00.610617 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-f8648f98b-7s5w4" event={"ID":"e1eb09e8-cbb7-416b-9683-a42a8b611239","Type":"ContainerStarted","Data":"e9ac590a0e4402e26f2a77d5723c398e5a983831a6cf2a28bd9faae9fb26f44b"} Nov 28 13:33:00 crc kubenswrapper[4970]: I1128 13:33:00.617426 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-7qf2x" event={"ID":"d8109ee2-cb6f-4706-a9d0-93fbec9b4234","Type":"ContainerStarted","Data":"26af8608cf4141aa8bb6b6a8827b6290e5258c0511d191fb336f1de092203b59"} Nov 28 13:33:01 crc kubenswrapper[4970]: I1128 13:33:01.630956 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-vj6nx" event={"ID":"ce7e9380-adac-4723-8ced-16693bce1923","Type":"ContainerStarted","Data":"295ec1e2948a451655b07ebea7079d80b32c53d7b5e15356c8ccdf227832ed69"} Nov 28 13:33:03 crc kubenswrapper[4970]: I1128 13:33:03.666098 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-f8648f98b-7s5w4" event={"ID":"e1eb09e8-cbb7-416b-9683-a42a8b611239","Type":"ContainerStarted","Data":"54e4dadaac89a391914f7288879a2ee1ddffa298a4383beea2bc698da898ae37"} Nov 28 13:33:03 crc kubenswrapper[4970]: I1128 13:33:03.666738 4970 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/controller-f8648f98b-7s5w4" Nov 28 13:33:03 crc kubenswrapper[4970]: I1128 13:33:03.687483 4970 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/controller-f8648f98b-7s5w4" podStartSLOduration=1.9644630570000001 podStartE2EDuration="5.687454709s" podCreationTimestamp="2025-11-28 13:32:58 +0000 UTC" firstStartedPulling="2025-11-28 13:32:59.775588146 +0000 UTC m=+790.628469946" lastFinishedPulling="2025-11-28 13:33:03.498579798 +0000 UTC m=+794.351461598" observedRunningTime="2025-11-28 13:33:03.683749282 +0000 UTC m=+794.536631092" watchObservedRunningTime="2025-11-28 13:33:03.687454709 +0000 UTC m=+794.540336509" Nov 28 13:33:04 crc kubenswrapper[4970]: I1128 13:33:04.681262 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-vj6nx" event={"ID":"ce7e9380-adac-4723-8ced-16693bce1923","Type":"ContainerStarted","Data":"c1358bcaeb47fad5ac2943e7225f777a01768020a54696c8e984607649b7c529"} Nov 28 13:33:04 crc kubenswrapper[4970]: I1128 13:33:04.681948 4970 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/speaker-vj6nx" Nov 28 13:33:04 crc kubenswrapper[4970]: I1128 13:33:04.699344 4970 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/speaker-vj6nx" podStartSLOduration=4.040443289 podStartE2EDuration="6.699325732s" podCreationTimestamp="2025-11-28 13:32:58 +0000 UTC" firstStartedPulling="2025-11-28 13:33:00.854347475 +0000 UTC m=+791.707229275" lastFinishedPulling="2025-11-28 13:33:03.513229918 +0000 UTC m=+794.366111718" observedRunningTime="2025-11-28 13:33:04.696999245 +0000 UTC m=+795.549881045" watchObservedRunningTime="2025-11-28 13:33:04.699325732 +0000 UTC m=+795.552207532" Nov 28 13:33:08 crc kubenswrapper[4970]: I1128 
13:33:08.713637 4970 generic.go:334] "Generic (PLEG): container finished" podID="d8109ee2-cb6f-4706-a9d0-93fbec9b4234" containerID="5e6c4f6f286872ec34bbf2b503d88631df2c1150c8cca55632a7522a975a912e" exitCode=0 Nov 28 13:33:08 crc kubenswrapper[4970]: I1128 13:33:08.714175 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-7qf2x" event={"ID":"d8109ee2-cb6f-4706-a9d0-93fbec9b4234","Type":"ContainerDied","Data":"5e6c4f6f286872ec34bbf2b503d88631df2c1150c8cca55632a7522a975a912e"} Nov 28 13:33:08 crc kubenswrapper[4970]: I1128 13:33:08.716775 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-tdbd5" event={"ID":"539244ae-76b7-443b-9352-5d8d2f8da8e9","Type":"ContainerStarted","Data":"c5a63046b248db3136bc56052ec8864a11e67b7b4f9f7cc8820743fef939fba8"} Nov 28 13:33:08 crc kubenswrapper[4970]: I1128 13:33:08.717516 4970 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-tdbd5" Nov 28 13:33:09 crc kubenswrapper[4970]: I1128 13:33:09.227317 4970 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/controller-f8648f98b-7s5w4" Nov 28 13:33:09 crc kubenswrapper[4970]: I1128 13:33:09.243720 4970 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-tdbd5" podStartSLOduration=2.943467308 podStartE2EDuration="11.243695223s" podCreationTimestamp="2025-11-28 13:32:58 +0000 UTC" firstStartedPulling="2025-11-28 13:32:59.22343745 +0000 UTC m=+790.076319250" lastFinishedPulling="2025-11-28 13:33:07.523665365 +0000 UTC m=+798.376547165" observedRunningTime="2025-11-28 13:33:08.754702306 +0000 UTC m=+799.607584126" watchObservedRunningTime="2025-11-28 13:33:09.243695223 +0000 UTC m=+800.096577023" Nov 28 13:33:09 crc kubenswrapper[4970]: I1128 13:33:09.726736 4970 generic.go:334] "Generic (PLEG): container finished" podID="d8109ee2-cb6f-4706-a9d0-93fbec9b4234" containerID="a8e6909965932d74170ceb86a3727b95a1bedf33426200a6215785048fc638f8" exitCode=0 Nov 28 13:33:09 crc kubenswrapper[4970]: I1128 13:33:09.726844 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-7qf2x" event={"ID":"d8109ee2-cb6f-4706-a9d0-93fbec9b4234","Type":"ContainerDied","Data":"a8e6909965932d74170ceb86a3727b95a1bedf33426200a6215785048fc638f8"} Nov 28 13:33:10 crc kubenswrapper[4970]: I1128 13:33:10.406970 4970 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/speaker-vj6nx" Nov 28 13:33:10 crc kubenswrapper[4970]: I1128 13:33:10.734867 4970 generic.go:334] "Generic (PLEG): container finished" podID="d8109ee2-cb6f-4706-a9d0-93fbec9b4234" containerID="6a48e885bf9e5e9008fb34fbe84b494e3260269460a843a8849c9f1320e06f62" exitCode=0 Nov 28 13:33:10 crc kubenswrapper[4970]: I1128 13:33:10.734927 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-7qf2x" event={"ID":"d8109ee2-cb6f-4706-a9d0-93fbec9b4234","Type":"ContainerDied","Data":"6a48e885bf9e5e9008fb34fbe84b494e3260269460a843a8849c9f1320e06f62"} Nov 28 13:33:11 crc kubenswrapper[4970]: I1128 13:33:11.756491 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-7qf2x" event={"ID":"d8109ee2-cb6f-4706-a9d0-93fbec9b4234","Type":"ContainerStarted","Data":"3c6595a30cbc69aeb4a2b0569001b72a3ab2511f2c8e1f8e4f780c1f8135762b"} Nov 28 13:33:11 crc kubenswrapper[4970]: I1128 13:33:11.756753 4970 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="metallb-system/frr-k8s-7qf2x" event={"ID":"d8109ee2-cb6f-4706-a9d0-93fbec9b4234","Type":"ContainerStarted","Data":"a7f16b5b9dcbfbfb53937378f993d8bc9f0ab66df202d13157d9f8a66f90c1f3"} Nov 28 13:33:11 crc kubenswrapper[4970]: I1128 13:33:11.756769 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-7qf2x" event={"ID":"d8109ee2-cb6f-4706-a9d0-93fbec9b4234","Type":"ContainerStarted","Data":"e10c17882855477393d1f04ba0fd087710eb68c7869e126eea3814b32fd1e5b4"} Nov 28 13:33:11 crc kubenswrapper[4970]: I1128 13:33:11.756780 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-7qf2x" event={"ID":"d8109ee2-cb6f-4706-a9d0-93fbec9b4234","Type":"ContainerStarted","Data":"009f463d464c8cdf70440ead527a3c759246095a2f005612d80ba7c6af745d3a"} Nov 28 13:33:11 crc kubenswrapper[4970]: I1128 13:33:11.756791 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-7qf2x" event={"ID":"d8109ee2-cb6f-4706-a9d0-93fbec9b4234","Type":"ContainerStarted","Data":"1f771a95dbda1b0116af9bfc356483989a5634b2a8f963467c3834156b4baca7"} Nov 28 13:33:12 crc kubenswrapper[4970]: I1128 13:33:12.767740 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-7qf2x" event={"ID":"d8109ee2-cb6f-4706-a9d0-93fbec9b4234","Type":"ContainerStarted","Data":"20b3d0ebf9a04859c8725e4106c068804ad56ee372735fac69a6095cf56d21ad"} Nov 28 13:33:12 crc kubenswrapper[4970]: I1128 13:33:12.767982 4970 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-7qf2x" Nov 28 13:33:12 crc kubenswrapper[4970]: I1128 13:33:12.812939 4970 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-7qf2x" podStartSLOduration=7.032044733 podStartE2EDuration="14.812918151s" podCreationTimestamp="2025-11-28 13:32:58 +0000 UTC" firstStartedPulling="2025-11-28 13:32:59.759601308 +0000 UTC m=+790.612483158" lastFinishedPulling="2025-11-28 13:33:07.540474776 +0000 UTC m=+798.393356576" observedRunningTime="2025-11-28 13:33:12.805645643 +0000 UTC m=+803.658527443" watchObservedRunningTime="2025-11-28 13:33:12.812918151 +0000 UTC m=+803.665799951" Nov 28 13:33:13 crc kubenswrapper[4970]: I1128 13:33:13.266092 4970 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/infra-operator-index-lv47q"] Nov 28 13:33:13 crc kubenswrapper[4970]: I1128 13:33:13.266893 4970 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/infra-operator-index-lv47q" Nov 28 13:33:13 crc kubenswrapper[4970]: I1128 13:33:13.268925 4970 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-index-dockercfg-qp7t7" Nov 28 13:33:13 crc kubenswrapper[4970]: I1128 13:33:13.283139 4970 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-index-lv47q"] Nov 28 13:33:13 crc kubenswrapper[4970]: I1128 13:33:13.417074 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-95wcb\" (UniqueName: \"kubernetes.io/projected/d396c001-d2ca-45b7-94e5-4a5cd8f25435-kube-api-access-95wcb\") pod \"infra-operator-index-lv47q\" (UID: \"d396c001-d2ca-45b7-94e5-4a5cd8f25435\") " pod="openstack-operators/infra-operator-index-lv47q" Nov 28 13:33:13 crc kubenswrapper[4970]: I1128 13:33:13.518341 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-95wcb\" (UniqueName: \"kubernetes.io/projected/d396c001-d2ca-45b7-94e5-4a5cd8f25435-kube-api-access-95wcb\") pod \"infra-operator-index-lv47q\" (UID: \"d396c001-d2ca-45b7-94e5-4a5cd8f25435\") " pod="openstack-operators/infra-operator-index-lv47q" Nov 28 13:33:13 crc kubenswrapper[4970]: I1128 13:33:13.552347 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-95wcb\" (UniqueName: \"kubernetes.io/projected/d396c001-d2ca-45b7-94e5-4a5cd8f25435-kube-api-access-95wcb\") pod \"infra-operator-index-lv47q\" (UID: \"d396c001-d2ca-45b7-94e5-4a5cd8f25435\") " pod="openstack-operators/infra-operator-index-lv47q" Nov 28 13:33:13 crc kubenswrapper[4970]: I1128 13:33:13.583490 4970 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/infra-operator-index-lv47q" Nov 28 13:33:13 crc kubenswrapper[4970]: I1128 13:33:13.810374 4970 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="metallb-system/frr-k8s-7qf2x" Nov 28 13:33:13 crc kubenswrapper[4970]: I1128 13:33:13.867819 4970 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-index-lv47q"] Nov 28 13:33:13 crc kubenswrapper[4970]: I1128 13:33:13.872234 4970 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="metallb-system/frr-k8s-7qf2x" Nov 28 13:33:14 crc kubenswrapper[4970]: I1128 13:33:14.796085 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-index-lv47q" event={"ID":"d396c001-d2ca-45b7-94e5-4a5cd8f25435","Type":"ContainerStarted","Data":"7b2f343f86350d7901b0e04e0b1052c165a2b532d27d1261c601988005e640bc"} Nov 28 13:33:15 crc kubenswrapper[4970]: I1128 13:33:15.801396 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-index-lv47q" event={"ID":"d396c001-d2ca-45b7-94e5-4a5cd8f25435","Type":"ContainerStarted","Data":"6733a08b3c4bbfb24fd908bd3077620165fd9514001ec693079f8141a9ff02a6"} Nov 28 13:33:15 crc kubenswrapper[4970]: I1128 13:33:15.815957 4970 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/infra-operator-index-lv47q" podStartSLOduration=1.6569057809999999 podStartE2EDuration="2.81593452s" podCreationTimestamp="2025-11-28 13:33:13 +0000 UTC" firstStartedPulling="2025-11-28 13:33:13.879569505 +0000 UTC m=+804.732451305" lastFinishedPulling="2025-11-28 13:33:15.038598214 +0000 UTC m=+805.891480044" observedRunningTime="2025-11-28 13:33:15.812556004 +0000 UTC m=+806.665437854" watchObservedRunningTime="2025-11-28 13:33:15.81593452 +0000 UTC m=+806.668816330" Nov 28 13:33:17 crc kubenswrapper[4970]: I1128 13:33:17.467176 4970 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/infra-operator-index-lv47q"] Nov 28 13:33:17 crc kubenswrapper[4970]: I1128 13:33:17.813971 4970 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/infra-operator-index-lv47q" podUID="d396c001-d2ca-45b7-94e5-4a5cd8f25435" containerName="registry-server" containerID="cri-o://6733a08b3c4bbfb24fd908bd3077620165fd9514001ec693079f8141a9ff02a6" gracePeriod=2 Nov 28 13:33:18 crc kubenswrapper[4970]: I1128 13:33:18.063671 4970 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/infra-operator-index-jfj76"] Nov 28 13:33:18 crc kubenswrapper[4970]: I1128 13:33:18.064971 4970 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/infra-operator-index-jfj76" Nov 28 13:33:18 crc kubenswrapper[4970]: I1128 13:33:18.072391 4970 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-index-jfj76"] Nov 28 13:33:18 crc kubenswrapper[4970]: I1128 13:33:18.121938 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nv47p\" (UniqueName: \"kubernetes.io/projected/defb0064-6731-4f52-872a-a26d8e82dd41-kube-api-access-nv47p\") pod \"infra-operator-index-jfj76\" (UID: \"defb0064-6731-4f52-872a-a26d8e82dd41\") " pod="openstack-operators/infra-operator-index-jfj76" Nov 28 13:33:18 crc kubenswrapper[4970]: I1128 13:33:18.225227 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nv47p\" (UniqueName: \"kubernetes.io/projected/defb0064-6731-4f52-872a-a26d8e82dd41-kube-api-access-nv47p\") pod \"infra-operator-index-jfj76\" (UID: \"defb0064-6731-4f52-872a-a26d8e82dd41\") " pod="openstack-operators/infra-operator-index-jfj76" Nov 28 13:33:18 crc kubenswrapper[4970]: I1128 13:33:18.250184 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nv47p\" (UniqueName: \"kubernetes.io/projected/defb0064-6731-4f52-872a-a26d8e82dd41-kube-api-access-nv47p\") pod \"infra-operator-index-jfj76\" (UID: \"defb0064-6731-4f52-872a-a26d8e82dd41\") " pod="openstack-operators/infra-operator-index-jfj76" Nov 28 13:33:18 crc kubenswrapper[4970]: I1128 13:33:18.368608 4970 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-index-lv47q" Nov 28 13:33:18 crc kubenswrapper[4970]: I1128 13:33:18.427016 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-95wcb\" (UniqueName: \"kubernetes.io/projected/d396c001-d2ca-45b7-94e5-4a5cd8f25435-kube-api-access-95wcb\") pod \"d396c001-d2ca-45b7-94e5-4a5cd8f25435\" (UID: \"d396c001-d2ca-45b7-94e5-4a5cd8f25435\") " Nov 28 13:33:18 crc kubenswrapper[4970]: I1128 13:33:18.431072 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d396c001-d2ca-45b7-94e5-4a5cd8f25435-kube-api-access-95wcb" (OuterVolumeSpecName: "kube-api-access-95wcb") pod "d396c001-d2ca-45b7-94e5-4a5cd8f25435" (UID: "d396c001-d2ca-45b7-94e5-4a5cd8f25435"). InnerVolumeSpecName "kube-api-access-95wcb". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 28 13:33:18 crc kubenswrapper[4970]: I1128 13:33:18.438352 4970 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/infra-operator-index-jfj76" Nov 28 13:33:18 crc kubenswrapper[4970]: I1128 13:33:18.528778 4970 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-95wcb\" (UniqueName: \"kubernetes.io/projected/d396c001-d2ca-45b7-94e5-4a5cd8f25435-kube-api-access-95wcb\") on node \"crc\" DevicePath \"\"" Nov 28 13:33:18 crc kubenswrapper[4970]: I1128 13:33:18.821983 4970 generic.go:334] "Generic (PLEG): container finished" podID="d396c001-d2ca-45b7-94e5-4a5cd8f25435" containerID="6733a08b3c4bbfb24fd908bd3077620165fd9514001ec693079f8141a9ff02a6" exitCode=0 Nov 28 13:33:18 crc kubenswrapper[4970]: I1128 13:33:18.822029 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-index-lv47q" event={"ID":"d396c001-d2ca-45b7-94e5-4a5cd8f25435","Type":"ContainerDied","Data":"6733a08b3c4bbfb24fd908bd3077620165fd9514001ec693079f8141a9ff02a6"} Nov 28 13:33:18 crc kubenswrapper[4970]: I1128 13:33:18.822051 4970 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-index-lv47q" Nov 28 13:33:18 crc kubenswrapper[4970]: I1128 13:33:18.822064 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-index-lv47q" event={"ID":"d396c001-d2ca-45b7-94e5-4a5cd8f25435","Type":"ContainerDied","Data":"7b2f343f86350d7901b0e04e0b1052c165a2b532d27d1261c601988005e640bc"} Nov 28 13:33:18 crc kubenswrapper[4970]: I1128 13:33:18.822086 4970 scope.go:117] "RemoveContainer" containerID="6733a08b3c4bbfb24fd908bd3077620165fd9514001ec693079f8141a9ff02a6" Nov 28 13:33:18 crc kubenswrapper[4970]: I1128 13:33:18.846807 4970 scope.go:117] "RemoveContainer" containerID="6733a08b3c4bbfb24fd908bd3077620165fd9514001ec693079f8141a9ff02a6" Nov 28 13:33:18 crc kubenswrapper[4970]: E1128 13:33:18.847427 4970 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6733a08b3c4bbfb24fd908bd3077620165fd9514001ec693079f8141a9ff02a6\": container with ID starting with 6733a08b3c4bbfb24fd908bd3077620165fd9514001ec693079f8141a9ff02a6 not found: ID does not exist" containerID="6733a08b3c4bbfb24fd908bd3077620165fd9514001ec693079f8141a9ff02a6" Nov 28 13:33:18 crc kubenswrapper[4970]: I1128 13:33:18.847482 4970 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6733a08b3c4bbfb24fd908bd3077620165fd9514001ec693079f8141a9ff02a6"} err="failed to get container status \"6733a08b3c4bbfb24fd908bd3077620165fd9514001ec693079f8141a9ff02a6\": rpc error: code = NotFound desc = could not find container \"6733a08b3c4bbfb24fd908bd3077620165fd9514001ec693079f8141a9ff02a6\": container with ID starting with 6733a08b3c4bbfb24fd908bd3077620165fd9514001ec693079f8141a9ff02a6 not found: ID does not exist" Nov 28 13:33:18 crc kubenswrapper[4970]: I1128 13:33:18.853671 4970 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-tdbd5" Nov 28 13:33:18 crc kubenswrapper[4970]: I1128 13:33:18.859464 4970 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/infra-operator-index-lv47q"] Nov 28 13:33:18 crc kubenswrapper[4970]: I1128 13:33:18.864497 4970 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/infra-operator-index-lv47q"] Nov 28 13:33:18 crc kubenswrapper[4970]: I1128 13:33:18.872422 4970 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack-operators/infra-operator-index-jfj76"] Nov 28 13:33:18 crc kubenswrapper[4970]: W1128 13:33:18.879248 4970 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddefb0064_6731_4f52_872a_a26d8e82dd41.slice/crio-dab6ba14e2048c2b6cdf88d408edee3b4a19d3ce8e33e151f4823f686a70bd96 WatchSource:0}: Error finding container dab6ba14e2048c2b6cdf88d408edee3b4a19d3ce8e33e151f4823f686a70bd96: Status 404 returned error can't find the container with id dab6ba14e2048c2b6cdf88d408edee3b4a19d3ce8e33e151f4823f686a70bd96 Nov 28 13:33:19 crc kubenswrapper[4970]: I1128 13:33:19.390050 4970 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d396c001-d2ca-45b7-94e5-4a5cd8f25435" path="/var/lib/kubelet/pods/d396c001-d2ca-45b7-94e5-4a5cd8f25435/volumes" Nov 28 13:33:19 crc kubenswrapper[4970]: I1128 13:33:19.828683 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-index-jfj76" event={"ID":"defb0064-6731-4f52-872a-a26d8e82dd41","Type":"ContainerStarted","Data":"dab6ba14e2048c2b6cdf88d408edee3b4a19d3ce8e33e151f4823f686a70bd96"} Nov 28 13:33:21 crc kubenswrapper[4970]: I1128 13:33:21.845881 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-index-jfj76" event={"ID":"defb0064-6731-4f52-872a-a26d8e82dd41","Type":"ContainerStarted","Data":"5bb08c62fb4b63a822e60972fd50f5b3cf4793a9a31f02e4308380658fcf1a3f"} Nov 28 13:33:21 crc kubenswrapper[4970]: I1128 13:33:21.865125 4970 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/infra-operator-index-jfj76" podStartSLOduration=2.072876587 podStartE2EDuration="3.865104725s" podCreationTimestamp="2025-11-28 13:33:18 +0000 UTC" firstStartedPulling="2025-11-28 13:33:18.891078065 +0000 UTC m=+809.743959875" lastFinishedPulling="2025-11-28 13:33:20.683306173 +0000 UTC m=+811.536188013" observedRunningTime="2025-11-28 13:33:21.8603766 +0000 UTC m=+812.713258400" watchObservedRunningTime="2025-11-28 13:33:21.865104725 +0000 UTC m=+812.717986525" Nov 28 13:33:28 crc kubenswrapper[4970]: I1128 13:33:28.439286 4970 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/infra-operator-index-jfj76" Nov 28 13:33:28 crc kubenswrapper[4970]: I1128 13:33:28.439711 4970 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/infra-operator-index-jfj76" Nov 28 13:33:28 crc kubenswrapper[4970]: I1128 13:33:28.467846 4970 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/infra-operator-index-jfj76" Nov 28 13:33:28 crc kubenswrapper[4970]: I1128 13:33:28.812539 4970 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-7qf2x" Nov 28 13:33:28 crc kubenswrapper[4970]: I1128 13:33:28.926426 4970 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/infra-operator-index-jfj76" Nov 28 13:33:30 crc kubenswrapper[4970]: I1128 13:33:30.709558 4970 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/5d473c3169f40b179d14921c90af2c8546b7b757fe551b7dba7d903f5dlwds2"] Nov 28 13:33:30 crc kubenswrapper[4970]: E1128 13:33:30.710170 4970 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d396c001-d2ca-45b7-94e5-4a5cd8f25435" containerName="registry-server" Nov 28 13:33:30 crc kubenswrapper[4970]: I1128 13:33:30.710192 4970 state_mem.go:107] "Deleted 
CPUSet assignment" podUID="d396c001-d2ca-45b7-94e5-4a5cd8f25435" containerName="registry-server" Nov 28 13:33:30 crc kubenswrapper[4970]: I1128 13:33:30.710400 4970 memory_manager.go:354] "RemoveStaleState removing state" podUID="d396c001-d2ca-45b7-94e5-4a5cd8f25435" containerName="registry-server" Nov 28 13:33:30 crc kubenswrapper[4970]: I1128 13:33:30.711531 4970 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/5d473c3169f40b179d14921c90af2c8546b7b757fe551b7dba7d903f5dlwds2" Nov 28 13:33:30 crc kubenswrapper[4970]: I1128 13:33:30.715018 4970 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-77hkb" Nov 28 13:33:30 crc kubenswrapper[4970]: I1128 13:33:30.719450 4970 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/5d473c3169f40b179d14921c90af2c8546b7b757fe551b7dba7d903f5dlwds2"] Nov 28 13:33:30 crc kubenswrapper[4970]: I1128 13:33:30.901125 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e958b4be-5525-4751-a2e5-feecdea9d82c-bundle\") pod \"5d473c3169f40b179d14921c90af2c8546b7b757fe551b7dba7d903f5dlwds2\" (UID: \"e958b4be-5525-4751-a2e5-feecdea9d82c\") " pod="openstack-operators/5d473c3169f40b179d14921c90af2c8546b7b757fe551b7dba7d903f5dlwds2" Nov 28 13:33:30 crc kubenswrapper[4970]: I1128 13:33:30.901305 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-djwdj\" (UniqueName: \"kubernetes.io/projected/e958b4be-5525-4751-a2e5-feecdea9d82c-kube-api-access-djwdj\") pod \"5d473c3169f40b179d14921c90af2c8546b7b757fe551b7dba7d903f5dlwds2\" (UID: \"e958b4be-5525-4751-a2e5-feecdea9d82c\") " pod="openstack-operators/5d473c3169f40b179d14921c90af2c8546b7b757fe551b7dba7d903f5dlwds2" Nov 28 13:33:30 crc kubenswrapper[4970]: I1128 13:33:30.901368 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/e958b4be-5525-4751-a2e5-feecdea9d82c-util\") pod \"5d473c3169f40b179d14921c90af2c8546b7b757fe551b7dba7d903f5dlwds2\" (UID: \"e958b4be-5525-4751-a2e5-feecdea9d82c\") " pod="openstack-operators/5d473c3169f40b179d14921c90af2c8546b7b757fe551b7dba7d903f5dlwds2" Nov 28 13:33:31 crc kubenswrapper[4970]: I1128 13:33:31.003394 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e958b4be-5525-4751-a2e5-feecdea9d82c-bundle\") pod \"5d473c3169f40b179d14921c90af2c8546b7b757fe551b7dba7d903f5dlwds2\" (UID: \"e958b4be-5525-4751-a2e5-feecdea9d82c\") " pod="openstack-operators/5d473c3169f40b179d14921c90af2c8546b7b757fe551b7dba7d903f5dlwds2" Nov 28 13:33:31 crc kubenswrapper[4970]: I1128 13:33:31.003887 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e958b4be-5525-4751-a2e5-feecdea9d82c-bundle\") pod \"5d473c3169f40b179d14921c90af2c8546b7b757fe551b7dba7d903f5dlwds2\" (UID: \"e958b4be-5525-4751-a2e5-feecdea9d82c\") " pod="openstack-operators/5d473c3169f40b179d14921c90af2c8546b7b757fe551b7dba7d903f5dlwds2" Nov 28 13:33:31 crc kubenswrapper[4970]: I1128 13:33:31.004024 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-djwdj\" (UniqueName: \"kubernetes.io/projected/e958b4be-5525-4751-a2e5-feecdea9d82c-kube-api-access-djwdj\") pod 
\"5d473c3169f40b179d14921c90af2c8546b7b757fe551b7dba7d903f5dlwds2\" (UID: \"e958b4be-5525-4751-a2e5-feecdea9d82c\") " pod="openstack-operators/5d473c3169f40b179d14921c90af2c8546b7b757fe551b7dba7d903f5dlwds2" Nov 28 13:33:31 crc kubenswrapper[4970]: I1128 13:33:31.004124 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/e958b4be-5525-4751-a2e5-feecdea9d82c-util\") pod \"5d473c3169f40b179d14921c90af2c8546b7b757fe551b7dba7d903f5dlwds2\" (UID: \"e958b4be-5525-4751-a2e5-feecdea9d82c\") " pod="openstack-operators/5d473c3169f40b179d14921c90af2c8546b7b757fe551b7dba7d903f5dlwds2" Nov 28 13:33:31 crc kubenswrapper[4970]: I1128 13:33:31.004669 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/e958b4be-5525-4751-a2e5-feecdea9d82c-util\") pod \"5d473c3169f40b179d14921c90af2c8546b7b757fe551b7dba7d903f5dlwds2\" (UID: \"e958b4be-5525-4751-a2e5-feecdea9d82c\") " pod="openstack-operators/5d473c3169f40b179d14921c90af2c8546b7b757fe551b7dba7d903f5dlwds2" Nov 28 13:33:31 crc kubenswrapper[4970]: I1128 13:33:31.029445 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-djwdj\" (UniqueName: \"kubernetes.io/projected/e958b4be-5525-4751-a2e5-feecdea9d82c-kube-api-access-djwdj\") pod \"5d473c3169f40b179d14921c90af2c8546b7b757fe551b7dba7d903f5dlwds2\" (UID: \"e958b4be-5525-4751-a2e5-feecdea9d82c\") " pod="openstack-operators/5d473c3169f40b179d14921c90af2c8546b7b757fe551b7dba7d903f5dlwds2" Nov 28 13:33:31 crc kubenswrapper[4970]: I1128 13:33:31.328697 4970 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/5d473c3169f40b179d14921c90af2c8546b7b757fe551b7dba7d903f5dlwds2" Nov 28 13:33:31 crc kubenswrapper[4970]: I1128 13:33:31.601916 4970 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/5d473c3169f40b179d14921c90af2c8546b7b757fe551b7dba7d903f5dlwds2"] Nov 28 13:33:31 crc kubenswrapper[4970]: I1128 13:33:31.915319 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/5d473c3169f40b179d14921c90af2c8546b7b757fe551b7dba7d903f5dlwds2" event={"ID":"e958b4be-5525-4751-a2e5-feecdea9d82c","Type":"ContainerStarted","Data":"d44c66c1a6ca4904f0cac909e67db41bb42ca0d39fb244c877c80a8500422185"} Nov 28 13:33:32 crc kubenswrapper[4970]: I1128 13:33:32.923700 4970 generic.go:334] "Generic (PLEG): container finished" podID="e958b4be-5525-4751-a2e5-feecdea9d82c" containerID="7505a745e05d58ede03f331bb0333008a0cbdf23e25c01b4f6b3b7f0150d816a" exitCode=0 Nov 28 13:33:32 crc kubenswrapper[4970]: I1128 13:33:32.923779 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/5d473c3169f40b179d14921c90af2c8546b7b757fe551b7dba7d903f5dlwds2" event={"ID":"e958b4be-5525-4751-a2e5-feecdea9d82c","Type":"ContainerDied","Data":"7505a745e05d58ede03f331bb0333008a0cbdf23e25c01b4f6b3b7f0150d816a"} Nov 28 13:33:35 crc kubenswrapper[4970]: I1128 13:33:35.950115 4970 generic.go:334] "Generic (PLEG): container finished" podID="e958b4be-5525-4751-a2e5-feecdea9d82c" containerID="bc0871b24a00f4320ff309bb37f630da06ef0fe23bdeefdd761346dd2ffed4b4" exitCode=0 Nov 28 13:33:35 crc kubenswrapper[4970]: I1128 13:33:35.950416 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/5d473c3169f40b179d14921c90af2c8546b7b757fe551b7dba7d903f5dlwds2" 
event={"ID":"e958b4be-5525-4751-a2e5-feecdea9d82c","Type":"ContainerDied","Data":"bc0871b24a00f4320ff309bb37f630da06ef0fe23bdeefdd761346dd2ffed4b4"} Nov 28 13:33:36 crc kubenswrapper[4970]: I1128 13:33:36.960278 4970 generic.go:334] "Generic (PLEG): container finished" podID="e958b4be-5525-4751-a2e5-feecdea9d82c" containerID="f03ef2cea48b019af731ad7a6a03ffa6855789bef5b3a51ec250f24482175461" exitCode=0 Nov 28 13:33:36 crc kubenswrapper[4970]: I1128 13:33:36.960332 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/5d473c3169f40b179d14921c90af2c8546b7b757fe551b7dba7d903f5dlwds2" event={"ID":"e958b4be-5525-4751-a2e5-feecdea9d82c","Type":"ContainerDied","Data":"f03ef2cea48b019af731ad7a6a03ffa6855789bef5b3a51ec250f24482175461"} Nov 28 13:33:38 crc kubenswrapper[4970]: I1128 13:33:38.344288 4970 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/5d473c3169f40b179d14921c90af2c8546b7b757fe551b7dba7d903f5dlwds2" Nov 28 13:33:38 crc kubenswrapper[4970]: I1128 13:33:38.514607 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e958b4be-5525-4751-a2e5-feecdea9d82c-bundle\") pod \"e958b4be-5525-4751-a2e5-feecdea9d82c\" (UID: \"e958b4be-5525-4751-a2e5-feecdea9d82c\") " Nov 28 13:33:38 crc kubenswrapper[4970]: I1128 13:33:38.514686 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/e958b4be-5525-4751-a2e5-feecdea9d82c-util\") pod \"e958b4be-5525-4751-a2e5-feecdea9d82c\" (UID: \"e958b4be-5525-4751-a2e5-feecdea9d82c\") " Nov 28 13:33:38 crc kubenswrapper[4970]: I1128 13:33:38.514741 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-djwdj\" (UniqueName: \"kubernetes.io/projected/e958b4be-5525-4751-a2e5-feecdea9d82c-kube-api-access-djwdj\") pod \"e958b4be-5525-4751-a2e5-feecdea9d82c\" (UID: \"e958b4be-5525-4751-a2e5-feecdea9d82c\") " Nov 28 13:33:38 crc kubenswrapper[4970]: I1128 13:33:38.516380 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e958b4be-5525-4751-a2e5-feecdea9d82c-bundle" (OuterVolumeSpecName: "bundle") pod "e958b4be-5525-4751-a2e5-feecdea9d82c" (UID: "e958b4be-5525-4751-a2e5-feecdea9d82c"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 28 13:33:38 crc kubenswrapper[4970]: I1128 13:33:38.523087 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e958b4be-5525-4751-a2e5-feecdea9d82c-kube-api-access-djwdj" (OuterVolumeSpecName: "kube-api-access-djwdj") pod "e958b4be-5525-4751-a2e5-feecdea9d82c" (UID: "e958b4be-5525-4751-a2e5-feecdea9d82c"). InnerVolumeSpecName "kube-api-access-djwdj". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 28 13:33:38 crc kubenswrapper[4970]: I1128 13:33:38.531804 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e958b4be-5525-4751-a2e5-feecdea9d82c-util" (OuterVolumeSpecName: "util") pod "e958b4be-5525-4751-a2e5-feecdea9d82c" (UID: "e958b4be-5525-4751-a2e5-feecdea9d82c"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 28 13:33:38 crc kubenswrapper[4970]: I1128 13:33:38.615998 4970 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-djwdj\" (UniqueName: \"kubernetes.io/projected/e958b4be-5525-4751-a2e5-feecdea9d82c-kube-api-access-djwdj\") on node \"crc\" DevicePath \"\"" Nov 28 13:33:38 crc kubenswrapper[4970]: I1128 13:33:38.616036 4970 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e958b4be-5525-4751-a2e5-feecdea9d82c-bundle\") on node \"crc\" DevicePath \"\"" Nov 28 13:33:38 crc kubenswrapper[4970]: I1128 13:33:38.616049 4970 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/e958b4be-5525-4751-a2e5-feecdea9d82c-util\") on node \"crc\" DevicePath \"\"" Nov 28 13:33:38 crc kubenswrapper[4970]: I1128 13:33:38.978730 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/5d473c3169f40b179d14921c90af2c8546b7b757fe551b7dba7d903f5dlwds2" event={"ID":"e958b4be-5525-4751-a2e5-feecdea9d82c","Type":"ContainerDied","Data":"d44c66c1a6ca4904f0cac909e67db41bb42ca0d39fb244c877c80a8500422185"} Nov 28 13:33:38 crc kubenswrapper[4970]: I1128 13:33:38.978771 4970 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d44c66c1a6ca4904f0cac909e67db41bb42ca0d39fb244c877c80a8500422185" Nov 28 13:33:38 crc kubenswrapper[4970]: I1128 13:33:38.978782 4970 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/5d473c3169f40b179d14921c90af2c8546b7b757fe551b7dba7d903f5dlwds2" Nov 28 13:33:42 crc kubenswrapper[4970]: I1128 13:33:42.820924 4970 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/infra-operator-controller-manager-7c855bfbc4-jhn25"] Nov 28 13:33:42 crc kubenswrapper[4970]: E1128 13:33:42.822759 4970 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e958b4be-5525-4751-a2e5-feecdea9d82c" containerName="extract" Nov 28 13:33:42 crc kubenswrapper[4970]: I1128 13:33:42.822806 4970 state_mem.go:107] "Deleted CPUSet assignment" podUID="e958b4be-5525-4751-a2e5-feecdea9d82c" containerName="extract" Nov 28 13:33:42 crc kubenswrapper[4970]: E1128 13:33:42.822823 4970 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e958b4be-5525-4751-a2e5-feecdea9d82c" containerName="pull" Nov 28 13:33:42 crc kubenswrapper[4970]: I1128 13:33:42.822833 4970 state_mem.go:107] "Deleted CPUSet assignment" podUID="e958b4be-5525-4751-a2e5-feecdea9d82c" containerName="pull" Nov 28 13:33:42 crc kubenswrapper[4970]: E1128 13:33:42.822858 4970 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e958b4be-5525-4751-a2e5-feecdea9d82c" containerName="util" Nov 28 13:33:42 crc kubenswrapper[4970]: I1128 13:33:42.822866 4970 state_mem.go:107] "Deleted CPUSet assignment" podUID="e958b4be-5525-4751-a2e5-feecdea9d82c" containerName="util" Nov 28 13:33:42 crc kubenswrapper[4970]: I1128 13:33:42.823010 4970 memory_manager.go:354] "RemoveStaleState removing state" podUID="e958b4be-5525-4751-a2e5-feecdea9d82c" containerName="extract" Nov 28 13:33:42 crc kubenswrapper[4970]: I1128 13:33:42.823867 4970 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-7c855bfbc4-jhn25" Nov 28 13:33:42 crc kubenswrapper[4970]: I1128 13:33:42.826319 4970 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-hdg2r" Nov 28 13:33:42 crc kubenswrapper[4970]: I1128 13:33:42.827712 4970 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-service-cert" Nov 28 13:33:42 crc kubenswrapper[4970]: I1128 13:33:42.840416 4970 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-7c855bfbc4-jhn25"] Nov 28 13:33:42 crc kubenswrapper[4970]: I1128 13:33:42.991162 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/d8b95186-66e8-493a-8eb9-79e4cd5b5a7d-apiservice-cert\") pod \"infra-operator-controller-manager-7c855bfbc4-jhn25\" (UID: \"d8b95186-66e8-493a-8eb9-79e4cd5b5a7d\") " pod="openstack-operators/infra-operator-controller-manager-7c855bfbc4-jhn25" Nov 28 13:33:42 crc kubenswrapper[4970]: I1128 13:33:42.991347 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/d8b95186-66e8-493a-8eb9-79e4cd5b5a7d-webhook-cert\") pod \"infra-operator-controller-manager-7c855bfbc4-jhn25\" (UID: \"d8b95186-66e8-493a-8eb9-79e4cd5b5a7d\") " pod="openstack-operators/infra-operator-controller-manager-7c855bfbc4-jhn25" Nov 28 13:33:42 crc kubenswrapper[4970]: I1128 13:33:42.991480 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zq2cm\" (UniqueName: \"kubernetes.io/projected/d8b95186-66e8-493a-8eb9-79e4cd5b5a7d-kube-api-access-zq2cm\") pod \"infra-operator-controller-manager-7c855bfbc4-jhn25\" (UID: \"d8b95186-66e8-493a-8eb9-79e4cd5b5a7d\") " pod="openstack-operators/infra-operator-controller-manager-7c855bfbc4-jhn25" Nov 28 13:33:43 crc kubenswrapper[4970]: I1128 13:33:43.091973 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/d8b95186-66e8-493a-8eb9-79e4cd5b5a7d-webhook-cert\") pod \"infra-operator-controller-manager-7c855bfbc4-jhn25\" (UID: \"d8b95186-66e8-493a-8eb9-79e4cd5b5a7d\") " pod="openstack-operators/infra-operator-controller-manager-7c855bfbc4-jhn25" Nov 28 13:33:43 crc kubenswrapper[4970]: I1128 13:33:43.092068 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zq2cm\" (UniqueName: \"kubernetes.io/projected/d8b95186-66e8-493a-8eb9-79e4cd5b5a7d-kube-api-access-zq2cm\") pod \"infra-operator-controller-manager-7c855bfbc4-jhn25\" (UID: \"d8b95186-66e8-493a-8eb9-79e4cd5b5a7d\") " pod="openstack-operators/infra-operator-controller-manager-7c855bfbc4-jhn25" Nov 28 13:33:43 crc kubenswrapper[4970]: I1128 13:33:43.092130 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/d8b95186-66e8-493a-8eb9-79e4cd5b5a7d-apiservice-cert\") pod \"infra-operator-controller-manager-7c855bfbc4-jhn25\" (UID: \"d8b95186-66e8-493a-8eb9-79e4cd5b5a7d\") " pod="openstack-operators/infra-operator-controller-manager-7c855bfbc4-jhn25" Nov 28 13:33:43 crc kubenswrapper[4970]: I1128 13:33:43.100077 4970 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/d8b95186-66e8-493a-8eb9-79e4cd5b5a7d-webhook-cert\") pod \"infra-operator-controller-manager-7c855bfbc4-jhn25\" (UID: \"d8b95186-66e8-493a-8eb9-79e4cd5b5a7d\") " pod="openstack-operators/infra-operator-controller-manager-7c855bfbc4-jhn25" Nov 28 13:33:43 crc kubenswrapper[4970]: I1128 13:33:43.107848 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/d8b95186-66e8-493a-8eb9-79e4cd5b5a7d-apiservice-cert\") pod \"infra-operator-controller-manager-7c855bfbc4-jhn25\" (UID: \"d8b95186-66e8-493a-8eb9-79e4cd5b5a7d\") " pod="openstack-operators/infra-operator-controller-manager-7c855bfbc4-jhn25" Nov 28 13:33:43 crc kubenswrapper[4970]: I1128 13:33:43.110907 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zq2cm\" (UniqueName: \"kubernetes.io/projected/d8b95186-66e8-493a-8eb9-79e4cd5b5a7d-kube-api-access-zq2cm\") pod \"infra-operator-controller-manager-7c855bfbc4-jhn25\" (UID: \"d8b95186-66e8-493a-8eb9-79e4cd5b5a7d\") " pod="openstack-operators/infra-operator-controller-manager-7c855bfbc4-jhn25" Nov 28 13:33:43 crc kubenswrapper[4970]: I1128 13:33:43.140065 4970 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-7c855bfbc4-jhn25" Nov 28 13:33:43 crc kubenswrapper[4970]: W1128 13:33:43.413390 4970 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd8b95186_66e8_493a_8eb9_79e4cd5b5a7d.slice/crio-e46e313c9050188c19cd9c042fa3abad000bb012f358d6d145ce1e1e3f07f09c WatchSource:0}: Error finding container e46e313c9050188c19cd9c042fa3abad000bb012f358d6d145ce1e1e3f07f09c: Status 404 returned error can't find the container with id e46e313c9050188c19cd9c042fa3abad000bb012f358d6d145ce1e1e3f07f09c Nov 28 13:33:43 crc kubenswrapper[4970]: I1128 13:33:43.415891 4970 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-7c855bfbc4-jhn25"] Nov 28 13:33:44 crc kubenswrapper[4970]: I1128 13:33:44.008636 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-7c855bfbc4-jhn25" event={"ID":"d8b95186-66e8-493a-8eb9-79e4cd5b5a7d","Type":"ContainerStarted","Data":"e46e313c9050188c19cd9c042fa3abad000bb012f358d6d145ce1e1e3f07f09c"} Nov 28 13:33:46 crc kubenswrapper[4970]: I1128 13:33:46.021115 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-7c855bfbc4-jhn25" event={"ID":"d8b95186-66e8-493a-8eb9-79e4cd5b5a7d","Type":"ContainerStarted","Data":"26166f82d6294755ed8cbbbf2de0a966eac1347f297da23ff2d500d91152689a"} Nov 28 13:33:47 crc kubenswrapper[4970]: I1128 13:33:47.647401 4970 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["keystone-kuttl-tests/openstack-galera-0"] Nov 28 13:33:47 crc kubenswrapper[4970]: I1128 13:33:47.649328 4970 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="keystone-kuttl-tests/openstack-galera-0" Nov 28 13:33:47 crc kubenswrapper[4970]: I1128 13:33:47.654236 4970 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"galera-openstack-dockercfg-wzmfx" Nov 28 13:33:47 crc kubenswrapper[4970]: I1128 13:33:47.660995 4970 reflector.go:368] Caches populated for *v1.ConfigMap from object-"keystone-kuttl-tests"/"openstack-config-data" Nov 28 13:33:47 crc kubenswrapper[4970]: I1128 13:33:47.661094 4970 reflector.go:368] Caches populated for *v1.ConfigMap from object-"keystone-kuttl-tests"/"openshift-service-ca.crt" Nov 28 13:33:47 crc kubenswrapper[4970]: I1128 13:33:47.661767 4970 reflector.go:368] Caches populated for *v1.ConfigMap from object-"keystone-kuttl-tests"/"openstack-scripts" Nov 28 13:33:47 crc kubenswrapper[4970]: I1128 13:33:47.661858 4970 reflector.go:368] Caches populated for *v1.ConfigMap from object-"keystone-kuttl-tests"/"kube-root-ca.crt" Nov 28 13:33:47 crc kubenswrapper[4970]: I1128 13:33:47.670412 4970 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/openstack-galera-0"] Nov 28 13:33:47 crc kubenswrapper[4970]: I1128 13:33:47.675747 4970 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["keystone-kuttl-tests/openstack-galera-1"] Nov 28 13:33:47 crc kubenswrapper[4970]: I1128 13:33:47.677678 4970 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/openstack-galera-1" Nov 28 13:33:47 crc kubenswrapper[4970]: I1128 13:33:47.679735 4970 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["keystone-kuttl-tests/openstack-galera-2"] Nov 28 13:33:47 crc kubenswrapper[4970]: I1128 13:33:47.680794 4970 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/openstack-galera-2" Nov 28 13:33:47 crc kubenswrapper[4970]: I1128 13:33:47.696995 4970 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/openstack-galera-1"] Nov 28 13:33:47 crc kubenswrapper[4970]: I1128 13:33:47.705170 4970 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/openstack-galera-2"] Nov 28 13:33:47 crc kubenswrapper[4970]: I1128 13:33:47.764318 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/1474c5bc-29c4-4da3-b2e9-900196941f19-kolla-config\") pod \"openstack-galera-0\" (UID: \"1474c5bc-29c4-4da3-b2e9-900196941f19\") " pod="keystone-kuttl-tests/openstack-galera-0" Nov 28 13:33:47 crc kubenswrapper[4970]: I1128 13:33:47.764374 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/1474c5bc-29c4-4da3-b2e9-900196941f19-config-data-generated\") pod \"openstack-galera-0\" (UID: \"1474c5bc-29c4-4da3-b2e9-900196941f19\") " pod="keystone-kuttl-tests/openstack-galera-0" Nov 28 13:33:47 crc kubenswrapper[4970]: I1128 13:33:47.764580 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/1474c5bc-29c4-4da3-b2e9-900196941f19-config-data-default\") pod \"openstack-galera-0\" (UID: \"1474c5bc-29c4-4da3-b2e9-900196941f19\") " pod="keystone-kuttl-tests/openstack-galera-0" Nov 28 13:33:47 crc kubenswrapper[4970]: I1128 13:33:47.764689 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1474c5bc-29c4-4da3-b2e9-900196941f19-operator-scripts\") pod \"openstack-galera-0\" (UID: \"1474c5bc-29c4-4da3-b2e9-900196941f19\") " pod="keystone-kuttl-tests/openstack-galera-0" Nov 28 13:33:47 crc kubenswrapper[4970]: I1128 13:33:47.764832 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"openstack-galera-0\" (UID: \"1474c5bc-29c4-4da3-b2e9-900196941f19\") " pod="keystone-kuttl-tests/openstack-galera-0" Nov 28 13:33:47 crc kubenswrapper[4970]: I1128 13:33:47.764883 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ck7kf\" (UniqueName: \"kubernetes.io/projected/1474c5bc-29c4-4da3-b2e9-900196941f19-kube-api-access-ck7kf\") pod \"openstack-galera-0\" (UID: \"1474c5bc-29c4-4da3-b2e9-900196941f19\") " pod="keystone-kuttl-tests/openstack-galera-0" Nov 28 13:33:47 crc kubenswrapper[4970]: I1128 13:33:47.866450 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"openstack-galera-2\" (UID: \"70137649-04fe-46dd-94ef-03a6ab19aecd\") " pod="keystone-kuttl-tests/openstack-galera-2" Nov 28 13:33:47 crc kubenswrapper[4970]: I1128 13:33:47.866510 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/70137649-04fe-46dd-94ef-03a6ab19aecd-kolla-config\") pod \"openstack-galera-2\" (UID: \"70137649-04fe-46dd-94ef-03a6ab19aecd\") " pod="keystone-kuttl-tests/openstack-galera-2" Nov 28 13:33:47 crc kubenswrapper[4970]: I1128 13:33:47.866541 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/3a4491a2-79c8-4e5b-8f2f-6c8182f09885-config-data-generated\") pod \"openstack-galera-1\" (UID: \"3a4491a2-79c8-4e5b-8f2f-6c8182f09885\") " pod="keystone-kuttl-tests/openstack-galera-1" Nov 28 13:33:47 crc kubenswrapper[4970]: I1128 13:33:47.866566 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/1474c5bc-29c4-4da3-b2e9-900196941f19-kolla-config\") pod \"openstack-galera-0\" (UID: \"1474c5bc-29c4-4da3-b2e9-900196941f19\") " pod="keystone-kuttl-tests/openstack-galera-0" Nov 28 13:33:47 crc kubenswrapper[4970]: I1128 13:33:47.866583 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"openstack-galera-1\" (UID: \"3a4491a2-79c8-4e5b-8f2f-6c8182f09885\") " pod="keystone-kuttl-tests/openstack-galera-1" Nov 28 13:33:47 crc kubenswrapper[4970]: I1128 13:33:47.866601 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/1474c5bc-29c4-4da3-b2e9-900196941f19-config-data-generated\") pod \"openstack-galera-0\" (UID: \"1474c5bc-29c4-4da3-b2e9-900196941f19\") " pod="keystone-kuttl-tests/openstack-galera-0" Nov 28 13:33:47 crc kubenswrapper[4970]: I1128 13:33:47.866742 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n5cz5\" 
(UniqueName: \"kubernetes.io/projected/3a4491a2-79c8-4e5b-8f2f-6c8182f09885-kube-api-access-n5cz5\") pod \"openstack-galera-1\" (UID: \"3a4491a2-79c8-4e5b-8f2f-6c8182f09885\") " pod="keystone-kuttl-tests/openstack-galera-1" Nov 28 13:33:47 crc kubenswrapper[4970]: I1128 13:33:47.866768 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/70137649-04fe-46dd-94ef-03a6ab19aecd-operator-scripts\") pod \"openstack-galera-2\" (UID: \"70137649-04fe-46dd-94ef-03a6ab19aecd\") " pod="keystone-kuttl-tests/openstack-galera-2" Nov 28 13:33:47 crc kubenswrapper[4970]: I1128 13:33:47.866789 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/3a4491a2-79c8-4e5b-8f2f-6c8182f09885-kolla-config\") pod \"openstack-galera-1\" (UID: \"3a4491a2-79c8-4e5b-8f2f-6c8182f09885\") " pod="keystone-kuttl-tests/openstack-galera-1" Nov 28 13:33:47 crc kubenswrapper[4970]: I1128 13:33:47.866816 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/3a4491a2-79c8-4e5b-8f2f-6c8182f09885-config-data-default\") pod \"openstack-galera-1\" (UID: \"3a4491a2-79c8-4e5b-8f2f-6c8182f09885\") " pod="keystone-kuttl-tests/openstack-galera-1" Nov 28 13:33:47 crc kubenswrapper[4970]: I1128 13:33:47.866843 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fg244\" (UniqueName: \"kubernetes.io/projected/70137649-04fe-46dd-94ef-03a6ab19aecd-kube-api-access-fg244\") pod \"openstack-galera-2\" (UID: \"70137649-04fe-46dd-94ef-03a6ab19aecd\") " pod="keystone-kuttl-tests/openstack-galera-2" Nov 28 13:33:47 crc kubenswrapper[4970]: I1128 13:33:47.866867 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3a4491a2-79c8-4e5b-8f2f-6c8182f09885-operator-scripts\") pod \"openstack-galera-1\" (UID: \"3a4491a2-79c8-4e5b-8f2f-6c8182f09885\") " pod="keystone-kuttl-tests/openstack-galera-1" Nov 28 13:33:47 crc kubenswrapper[4970]: I1128 13:33:47.866893 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/70137649-04fe-46dd-94ef-03a6ab19aecd-config-data-default\") pod \"openstack-galera-2\" (UID: \"70137649-04fe-46dd-94ef-03a6ab19aecd\") " pod="keystone-kuttl-tests/openstack-galera-2" Nov 28 13:33:47 crc kubenswrapper[4970]: I1128 13:33:47.866920 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/1474c5bc-29c4-4da3-b2e9-900196941f19-config-data-default\") pod \"openstack-galera-0\" (UID: \"1474c5bc-29c4-4da3-b2e9-900196941f19\") " pod="keystone-kuttl-tests/openstack-galera-0" Nov 28 13:33:47 crc kubenswrapper[4970]: I1128 13:33:47.866951 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/70137649-04fe-46dd-94ef-03a6ab19aecd-config-data-generated\") pod \"openstack-galera-2\" (UID: \"70137649-04fe-46dd-94ef-03a6ab19aecd\") " pod="keystone-kuttl-tests/openstack-galera-2" Nov 28 13:33:47 crc kubenswrapper[4970]: I1128 13:33:47.866978 4970 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1474c5bc-29c4-4da3-b2e9-900196941f19-operator-scripts\") pod \"openstack-galera-0\" (UID: \"1474c5bc-29c4-4da3-b2e9-900196941f19\") " pod="keystone-kuttl-tests/openstack-galera-0" Nov 28 13:33:47 crc kubenswrapper[4970]: I1128 13:33:47.867028 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"openstack-galera-0\" (UID: \"1474c5bc-29c4-4da3-b2e9-900196941f19\") " pod="keystone-kuttl-tests/openstack-galera-0" Nov 28 13:33:47 crc kubenswrapper[4970]: I1128 13:33:47.867050 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ck7kf\" (UniqueName: \"kubernetes.io/projected/1474c5bc-29c4-4da3-b2e9-900196941f19-kube-api-access-ck7kf\") pod \"openstack-galera-0\" (UID: \"1474c5bc-29c4-4da3-b2e9-900196941f19\") " pod="keystone-kuttl-tests/openstack-galera-0" Nov 28 13:33:47 crc kubenswrapper[4970]: I1128 13:33:47.868247 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/1474c5bc-29c4-4da3-b2e9-900196941f19-config-data-generated\") pod \"openstack-galera-0\" (UID: \"1474c5bc-29c4-4da3-b2e9-900196941f19\") " pod="keystone-kuttl-tests/openstack-galera-0" Nov 28 13:33:47 crc kubenswrapper[4970]: I1128 13:33:47.868852 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/1474c5bc-29c4-4da3-b2e9-900196941f19-config-data-default\") pod \"openstack-galera-0\" (UID: \"1474c5bc-29c4-4da3-b2e9-900196941f19\") " pod="keystone-kuttl-tests/openstack-galera-0" Nov 28 13:33:47 crc kubenswrapper[4970]: I1128 13:33:47.869096 4970 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"openstack-galera-0\" (UID: \"1474c5bc-29c4-4da3-b2e9-900196941f19\") device mount path \"/mnt/openstack/pv01\"" pod="keystone-kuttl-tests/openstack-galera-0" Nov 28 13:33:47 crc kubenswrapper[4970]: I1128 13:33:47.871187 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1474c5bc-29c4-4da3-b2e9-900196941f19-operator-scripts\") pod \"openstack-galera-0\" (UID: \"1474c5bc-29c4-4da3-b2e9-900196941f19\") " pod="keystone-kuttl-tests/openstack-galera-0" Nov 28 13:33:47 crc kubenswrapper[4970]: I1128 13:33:47.871486 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/1474c5bc-29c4-4da3-b2e9-900196941f19-kolla-config\") pod \"openstack-galera-0\" (UID: \"1474c5bc-29c4-4da3-b2e9-900196941f19\") " pod="keystone-kuttl-tests/openstack-galera-0" Nov 28 13:33:47 crc kubenswrapper[4970]: I1128 13:33:47.887051 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ck7kf\" (UniqueName: \"kubernetes.io/projected/1474c5bc-29c4-4da3-b2e9-900196941f19-kube-api-access-ck7kf\") pod \"openstack-galera-0\" (UID: \"1474c5bc-29c4-4da3-b2e9-900196941f19\") " pod="keystone-kuttl-tests/openstack-galera-0" Nov 28 13:33:47 crc kubenswrapper[4970]: I1128 13:33:47.888205 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage01-crc\") pod \"openstack-galera-0\" (UID: \"1474c5bc-29c4-4da3-b2e9-900196941f19\") " pod="keystone-kuttl-tests/openstack-galera-0" Nov 28 13:33:47 crc kubenswrapper[4970]: I1128 13:33:47.969033 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"openstack-galera-2\" (UID: \"70137649-04fe-46dd-94ef-03a6ab19aecd\") " pod="keystone-kuttl-tests/openstack-galera-2" Nov 28 13:33:47 crc kubenswrapper[4970]: I1128 13:33:47.969125 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/70137649-04fe-46dd-94ef-03a6ab19aecd-kolla-config\") pod \"openstack-galera-2\" (UID: \"70137649-04fe-46dd-94ef-03a6ab19aecd\") " pod="keystone-kuttl-tests/openstack-galera-2" Nov 28 13:33:47 crc kubenswrapper[4970]: I1128 13:33:47.969175 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/3a4491a2-79c8-4e5b-8f2f-6c8182f09885-config-data-generated\") pod \"openstack-galera-1\" (UID: \"3a4491a2-79c8-4e5b-8f2f-6c8182f09885\") " pod="keystone-kuttl-tests/openstack-galera-1" Nov 28 13:33:47 crc kubenswrapper[4970]: I1128 13:33:47.969242 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"openstack-galera-1\" (UID: \"3a4491a2-79c8-4e5b-8f2f-6c8182f09885\") " pod="keystone-kuttl-tests/openstack-galera-1" Nov 28 13:33:47 crc kubenswrapper[4970]: I1128 13:33:47.969303 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n5cz5\" (UniqueName: \"kubernetes.io/projected/3a4491a2-79c8-4e5b-8f2f-6c8182f09885-kube-api-access-n5cz5\") pod \"openstack-galera-1\" (UID: \"3a4491a2-79c8-4e5b-8f2f-6c8182f09885\") " pod="keystone-kuttl-tests/openstack-galera-1" Nov 28 13:33:47 crc kubenswrapper[4970]: I1128 13:33:47.969337 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/70137649-04fe-46dd-94ef-03a6ab19aecd-operator-scripts\") pod \"openstack-galera-2\" (UID: \"70137649-04fe-46dd-94ef-03a6ab19aecd\") " pod="keystone-kuttl-tests/openstack-galera-2" Nov 28 13:33:47 crc kubenswrapper[4970]: I1128 13:33:47.969374 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/3a4491a2-79c8-4e5b-8f2f-6c8182f09885-kolla-config\") pod \"openstack-galera-1\" (UID: \"3a4491a2-79c8-4e5b-8f2f-6c8182f09885\") " pod="keystone-kuttl-tests/openstack-galera-1" Nov 28 13:33:47 crc kubenswrapper[4970]: I1128 13:33:47.969415 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/3a4491a2-79c8-4e5b-8f2f-6c8182f09885-config-data-default\") pod \"openstack-galera-1\" (UID: \"3a4491a2-79c8-4e5b-8f2f-6c8182f09885\") " pod="keystone-kuttl-tests/openstack-galera-1" Nov 28 13:33:47 crc kubenswrapper[4970]: I1128 13:33:47.969453 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fg244\" (UniqueName: \"kubernetes.io/projected/70137649-04fe-46dd-94ef-03a6ab19aecd-kube-api-access-fg244\") pod \"openstack-galera-2\" (UID: \"70137649-04fe-46dd-94ef-03a6ab19aecd\") " 
pod="keystone-kuttl-tests/openstack-galera-2" Nov 28 13:33:47 crc kubenswrapper[4970]: I1128 13:33:47.969489 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3a4491a2-79c8-4e5b-8f2f-6c8182f09885-operator-scripts\") pod \"openstack-galera-1\" (UID: \"3a4491a2-79c8-4e5b-8f2f-6c8182f09885\") " pod="keystone-kuttl-tests/openstack-galera-1" Nov 28 13:33:47 crc kubenswrapper[4970]: I1128 13:33:47.969530 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/70137649-04fe-46dd-94ef-03a6ab19aecd-config-data-default\") pod \"openstack-galera-2\" (UID: \"70137649-04fe-46dd-94ef-03a6ab19aecd\") " pod="keystone-kuttl-tests/openstack-galera-2" Nov 28 13:33:47 crc kubenswrapper[4970]: I1128 13:33:47.969577 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/70137649-04fe-46dd-94ef-03a6ab19aecd-config-data-generated\") pod \"openstack-galera-2\" (UID: \"70137649-04fe-46dd-94ef-03a6ab19aecd\") " pod="keystone-kuttl-tests/openstack-galera-2" Nov 28 13:33:47 crc kubenswrapper[4970]: I1128 13:33:47.969976 4970 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"openstack-galera-1\" (UID: \"3a4491a2-79c8-4e5b-8f2f-6c8182f09885\") device mount path \"/mnt/openstack/pv07\"" pod="keystone-kuttl-tests/openstack-galera-1" Nov 28 13:33:47 crc kubenswrapper[4970]: I1128 13:33:47.970183 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/3a4491a2-79c8-4e5b-8f2f-6c8182f09885-config-data-generated\") pod \"openstack-galera-1\" (UID: \"3a4491a2-79c8-4e5b-8f2f-6c8182f09885\") " pod="keystone-kuttl-tests/openstack-galera-1" Nov 28 13:33:47 crc kubenswrapper[4970]: I1128 13:33:47.970301 4970 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"openstack-galera-2\" (UID: \"70137649-04fe-46dd-94ef-03a6ab19aecd\") device mount path \"/mnt/openstack/pv08\"" pod="keystone-kuttl-tests/openstack-galera-2" Nov 28 13:33:47 crc kubenswrapper[4970]: I1128 13:33:47.971293 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/70137649-04fe-46dd-94ef-03a6ab19aecd-config-data-generated\") pod \"openstack-galera-2\" (UID: \"70137649-04fe-46dd-94ef-03a6ab19aecd\") " pod="keystone-kuttl-tests/openstack-galera-2" Nov 28 13:33:47 crc kubenswrapper[4970]: I1128 13:33:47.971595 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/70137649-04fe-46dd-94ef-03a6ab19aecd-kolla-config\") pod \"openstack-galera-2\" (UID: \"70137649-04fe-46dd-94ef-03a6ab19aecd\") " pod="keystone-kuttl-tests/openstack-galera-2" Nov 28 13:33:47 crc kubenswrapper[4970]: I1128 13:33:47.972153 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3a4491a2-79c8-4e5b-8f2f-6c8182f09885-operator-scripts\") pod \"openstack-galera-1\" (UID: \"3a4491a2-79c8-4e5b-8f2f-6c8182f09885\") " pod="keystone-kuttl-tests/openstack-galera-1" Nov 28 13:33:47 crc 
kubenswrapper[4970]: I1128 13:33:47.973110 4970 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/openstack-galera-0" Nov 28 13:33:47 crc kubenswrapper[4970]: I1128 13:33:47.973226 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/70137649-04fe-46dd-94ef-03a6ab19aecd-operator-scripts\") pod \"openstack-galera-2\" (UID: \"70137649-04fe-46dd-94ef-03a6ab19aecd\") " pod="keystone-kuttl-tests/openstack-galera-2" Nov 28 13:33:47 crc kubenswrapper[4970]: I1128 13:33:47.973723 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/3a4491a2-79c8-4e5b-8f2f-6c8182f09885-config-data-default\") pod \"openstack-galera-1\" (UID: \"3a4491a2-79c8-4e5b-8f2f-6c8182f09885\") " pod="keystone-kuttl-tests/openstack-galera-1" Nov 28 13:33:47 crc kubenswrapper[4970]: I1128 13:33:47.973784 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/70137649-04fe-46dd-94ef-03a6ab19aecd-config-data-default\") pod \"openstack-galera-2\" (UID: \"70137649-04fe-46dd-94ef-03a6ab19aecd\") " pod="keystone-kuttl-tests/openstack-galera-2" Nov 28 13:33:47 crc kubenswrapper[4970]: I1128 13:33:47.978451 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/3a4491a2-79c8-4e5b-8f2f-6c8182f09885-kolla-config\") pod \"openstack-galera-1\" (UID: \"3a4491a2-79c8-4e5b-8f2f-6c8182f09885\") " pod="keystone-kuttl-tests/openstack-galera-1" Nov 28 13:33:48 crc kubenswrapper[4970]: I1128 13:33:48.003065 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"openstack-galera-2\" (UID: \"70137649-04fe-46dd-94ef-03a6ab19aecd\") " pod="keystone-kuttl-tests/openstack-galera-2" Nov 28 13:33:48 crc kubenswrapper[4970]: I1128 13:33:48.012814 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n5cz5\" (UniqueName: \"kubernetes.io/projected/3a4491a2-79c8-4e5b-8f2f-6c8182f09885-kube-api-access-n5cz5\") pod \"openstack-galera-1\" (UID: \"3a4491a2-79c8-4e5b-8f2f-6c8182f09885\") " pod="keystone-kuttl-tests/openstack-galera-1" Nov 28 13:33:48 crc kubenswrapper[4970]: I1128 13:33:48.013100 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fg244\" (UniqueName: \"kubernetes.io/projected/70137649-04fe-46dd-94ef-03a6ab19aecd-kube-api-access-fg244\") pod \"openstack-galera-2\" (UID: \"70137649-04fe-46dd-94ef-03a6ab19aecd\") " pod="keystone-kuttl-tests/openstack-galera-2" Nov 28 13:33:48 crc kubenswrapper[4970]: I1128 13:33:48.022541 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"openstack-galera-1\" (UID: \"3a4491a2-79c8-4e5b-8f2f-6c8182f09885\") " pod="keystone-kuttl-tests/openstack-galera-1" Nov 28 13:33:48 crc kubenswrapper[4970]: I1128 13:33:48.024552 4970 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/openstack-galera-2" Nov 28 13:33:48 crc kubenswrapper[4970]: I1128 13:33:48.307475 4970 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="keystone-kuttl-tests/openstack-galera-1" Nov 28 13:33:48 crc kubenswrapper[4970]: I1128 13:33:48.466023 4970 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/openstack-galera-0"] Nov 28 13:33:50 crc kubenswrapper[4970]: I1128 13:33:50.110076 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/openstack-galera-0" event={"ID":"1474c5bc-29c4-4da3-b2e9-900196941f19","Type":"ContainerStarted","Data":"809fbf33122f8f03413edde274e24bfe122a29774186624f0ca7d1c7782f2c76"} Nov 28 13:33:50 crc kubenswrapper[4970]: I1128 13:33:50.233691 4970 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/openstack-galera-2"] Nov 28 13:33:50 crc kubenswrapper[4970]: I1128 13:33:50.270057 4970 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/openstack-galera-1"] Nov 28 13:33:50 crc kubenswrapper[4970]: W1128 13:33:50.277611 4970 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3a4491a2_79c8_4e5b_8f2f_6c8182f09885.slice/crio-9378584313b1e286fd6e0e0bf88cc692e44a29ea5a897d1e83aeb1e01904834a WatchSource:0}: Error finding container 9378584313b1e286fd6e0e0bf88cc692e44a29ea5a897d1e83aeb1e01904834a: Status 404 returned error can't find the container with id 9378584313b1e286fd6e0e0bf88cc692e44a29ea5a897d1e83aeb1e01904834a Nov 28 13:33:51 crc kubenswrapper[4970]: I1128 13:33:51.119597 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/openstack-galera-2" event={"ID":"70137649-04fe-46dd-94ef-03a6ab19aecd","Type":"ContainerStarted","Data":"c3384bb2c9f8f9bc54e7a5e36e7262765934d55b67e7f922ed1d5801b4726eed"} Nov 28 13:33:51 crc kubenswrapper[4970]: I1128 13:33:51.121519 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-7c855bfbc4-jhn25" event={"ID":"d8b95186-66e8-493a-8eb9-79e4cd5b5a7d","Type":"ContainerStarted","Data":"fcdc7b828b38f9346c1685bad3df9047ffb317c10c55ae7e67b569220f83b16e"} Nov 28 13:33:51 crc kubenswrapper[4970]: I1128 13:33:51.121798 4970 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/infra-operator-controller-manager-7c855bfbc4-jhn25" Nov 28 13:33:51 crc kubenswrapper[4970]: I1128 13:33:51.124207 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/openstack-galera-1" event={"ID":"3a4491a2-79c8-4e5b-8f2f-6c8182f09885","Type":"ContainerStarted","Data":"9378584313b1e286fd6e0e0bf88cc692e44a29ea5a897d1e83aeb1e01904834a"} Nov 28 13:33:51 crc kubenswrapper[4970]: I1128 13:33:51.125491 4970 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/infra-operator-controller-manager-7c855bfbc4-jhn25" Nov 28 13:33:51 crc kubenswrapper[4970]: I1128 13:33:51.153598 4970 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/infra-operator-controller-manager-7c855bfbc4-jhn25" podStartSLOduration=2.332955711 podStartE2EDuration="9.153578703s" podCreationTimestamp="2025-11-28 13:33:42 +0000 UTC" firstStartedPulling="2025-11-28 13:33:43.416796819 +0000 UTC m=+834.269678619" lastFinishedPulling="2025-11-28 13:33:50.237419811 +0000 UTC m=+841.090301611" observedRunningTime="2025-11-28 13:33:51.148632001 +0000 UTC m=+842.001513821" watchObservedRunningTime="2025-11-28 13:33:51.153578703 +0000 UTC m=+842.006460503" Nov 28 13:33:58 crc kubenswrapper[4970]: I1128 13:33:58.866829 4970 
kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-index-kjd76"] Nov 28 13:33:58 crc kubenswrapper[4970]: I1128 13:33:58.868048 4970 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-index-kjd76" Nov 28 13:33:58 crc kubenswrapper[4970]: I1128 13:33:58.870112 4970 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"rabbitmq-cluster-operator-index-dockercfg-xwxdp" Nov 28 13:33:58 crc kubenswrapper[4970]: I1128 13:33:58.879462 4970 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-index-kjd76"] Nov 28 13:33:58 crc kubenswrapper[4970]: I1128 13:33:58.957083 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8gq6x\" (UniqueName: \"kubernetes.io/projected/a8bd99b0-8e47-4a38-bedc-aad5cf3a7394-kube-api-access-8gq6x\") pod \"rabbitmq-cluster-operator-index-kjd76\" (UID: \"a8bd99b0-8e47-4a38-bedc-aad5cf3a7394\") " pod="openstack-operators/rabbitmq-cluster-operator-index-kjd76" Nov 28 13:33:59 crc kubenswrapper[4970]: I1128 13:33:59.059035 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8gq6x\" (UniqueName: \"kubernetes.io/projected/a8bd99b0-8e47-4a38-bedc-aad5cf3a7394-kube-api-access-8gq6x\") pod \"rabbitmq-cluster-operator-index-kjd76\" (UID: \"a8bd99b0-8e47-4a38-bedc-aad5cf3a7394\") " pod="openstack-operators/rabbitmq-cluster-operator-index-kjd76" Nov 28 13:33:59 crc kubenswrapper[4970]: I1128 13:33:59.083501 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8gq6x\" (UniqueName: \"kubernetes.io/projected/a8bd99b0-8e47-4a38-bedc-aad5cf3a7394-kube-api-access-8gq6x\") pod \"rabbitmq-cluster-operator-index-kjd76\" (UID: \"a8bd99b0-8e47-4a38-bedc-aad5cf3a7394\") " pod="openstack-operators/rabbitmq-cluster-operator-index-kjd76" Nov 28 13:33:59 crc kubenswrapper[4970]: I1128 13:33:59.184341 4970 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-index-kjd76" Nov 28 13:34:05 crc kubenswrapper[4970]: I1128 13:34:05.491363 4970 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-bzs99"] Nov 28 13:34:05 crc kubenswrapper[4970]: I1128 13:34:05.495199 4970 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-bzs99" Nov 28 13:34:05 crc kubenswrapper[4970]: I1128 13:34:05.506108 4970 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-bzs99"] Nov 28 13:34:05 crc kubenswrapper[4970]: I1128 13:34:05.580091 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4855b957-473e-4623-bd8d-faf428f492da-utilities\") pod \"redhat-marketplace-bzs99\" (UID: \"4855b957-473e-4623-bd8d-faf428f492da\") " pod="openshift-marketplace/redhat-marketplace-bzs99" Nov 28 13:34:05 crc kubenswrapper[4970]: I1128 13:34:05.580225 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8g6fd\" (UniqueName: \"kubernetes.io/projected/4855b957-473e-4623-bd8d-faf428f492da-kube-api-access-8g6fd\") pod \"redhat-marketplace-bzs99\" (UID: \"4855b957-473e-4623-bd8d-faf428f492da\") " pod="openshift-marketplace/redhat-marketplace-bzs99" Nov 28 13:34:05 crc kubenswrapper[4970]: I1128 13:34:05.580258 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4855b957-473e-4623-bd8d-faf428f492da-catalog-content\") pod \"redhat-marketplace-bzs99\" (UID: \"4855b957-473e-4623-bd8d-faf428f492da\") " pod="openshift-marketplace/redhat-marketplace-bzs99" Nov 28 13:34:05 crc kubenswrapper[4970]: I1128 13:34:05.681809 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8g6fd\" (UniqueName: \"kubernetes.io/projected/4855b957-473e-4623-bd8d-faf428f492da-kube-api-access-8g6fd\") pod \"redhat-marketplace-bzs99\" (UID: \"4855b957-473e-4623-bd8d-faf428f492da\") " pod="openshift-marketplace/redhat-marketplace-bzs99" Nov 28 13:34:05 crc kubenswrapper[4970]: I1128 13:34:05.681871 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4855b957-473e-4623-bd8d-faf428f492da-catalog-content\") pod \"redhat-marketplace-bzs99\" (UID: \"4855b957-473e-4623-bd8d-faf428f492da\") " pod="openshift-marketplace/redhat-marketplace-bzs99" Nov 28 13:34:05 crc kubenswrapper[4970]: I1128 13:34:05.681938 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4855b957-473e-4623-bd8d-faf428f492da-utilities\") pod \"redhat-marketplace-bzs99\" (UID: \"4855b957-473e-4623-bd8d-faf428f492da\") " pod="openshift-marketplace/redhat-marketplace-bzs99" Nov 28 13:34:05 crc kubenswrapper[4970]: I1128 13:34:05.682581 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4855b957-473e-4623-bd8d-faf428f492da-utilities\") pod \"redhat-marketplace-bzs99\" (UID: \"4855b957-473e-4623-bd8d-faf428f492da\") " pod="openshift-marketplace/redhat-marketplace-bzs99" Nov 28 13:34:05 crc kubenswrapper[4970]: I1128 13:34:05.682811 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4855b957-473e-4623-bd8d-faf428f492da-catalog-content\") pod \"redhat-marketplace-bzs99\" (UID: \"4855b957-473e-4623-bd8d-faf428f492da\") " pod="openshift-marketplace/redhat-marketplace-bzs99" Nov 28 13:34:05 crc kubenswrapper[4970]: I1128 13:34:05.711151 4970 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-8g6fd\" (UniqueName: \"kubernetes.io/projected/4855b957-473e-4623-bd8d-faf428f492da-kube-api-access-8g6fd\") pod \"redhat-marketplace-bzs99\" (UID: \"4855b957-473e-4623-bd8d-faf428f492da\") " pod="openshift-marketplace/redhat-marketplace-bzs99" Nov 28 13:34:05 crc kubenswrapper[4970]: I1128 13:34:05.844172 4970 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-bzs99" Nov 28 13:34:08 crc kubenswrapper[4970]: E1128 13:34:08.320656 4970 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-mariadb@sha256:10452e2144368e2f128c8fb8ef9e54880b06ef1d71d9f084a0217dcb099c51ce" Nov 28 13:34:08 crc kubenswrapper[4970]: E1128 13:34:08.320827 4970 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:mysql-bootstrap,Image:quay.io/podified-antelope-centos9/openstack-mariadb@sha256:10452e2144368e2f128c8fb8ef9e54880b06ef1d71d9f084a0217dcb099c51ce,Command:[bash /var/lib/operator-scripts/mysql_bootstrap.sh],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:True,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:mysql-db,ReadOnly:false,MountPath:/var/lib/mysql,SubPath:mysql,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data-default,ReadOnly:true,MountPath:/var/lib/config-data/default,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data-generated,ReadOnly:false,MountPath:/var/lib/config-data/generated,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:operator-scripts,ReadOnly:true,MountPath:/var/lib/operator-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kolla-config,ReadOnly:true,MountPath:/var/lib/kolla/config_files,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-n5cz5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod openstack-galera-1_keystone-kuttl-tests(3a4491a2-79c8-4e5b-8f2f-6c8182f09885): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Nov 28 13:34:08 crc kubenswrapper[4970]: E1128 13:34:08.322062 4970 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mysql-bootstrap\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="keystone-kuttl-tests/openstack-galera-1" 
podUID="3a4491a2-79c8-4e5b-8f2f-6c8182f09885" Nov 28 13:34:08 crc kubenswrapper[4970]: E1128 13:34:08.350108 4970 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-mariadb@sha256:10452e2144368e2f128c8fb8ef9e54880b06ef1d71d9f084a0217dcb099c51ce" Nov 28 13:34:08 crc kubenswrapper[4970]: E1128 13:34:08.350362 4970 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:mysql-bootstrap,Image:quay.io/podified-antelope-centos9/openstack-mariadb@sha256:10452e2144368e2f128c8fb8ef9e54880b06ef1d71d9f084a0217dcb099c51ce,Command:[bash /var/lib/operator-scripts/mysql_bootstrap.sh],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:True,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:mysql-db,ReadOnly:false,MountPath:/var/lib/mysql,SubPath:mysql,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data-default,ReadOnly:true,MountPath:/var/lib/config-data/default,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data-generated,ReadOnly:false,MountPath:/var/lib/config-data/generated,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:operator-scripts,ReadOnly:true,MountPath:/var/lib/operator-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kolla-config,ReadOnly:true,MountPath:/var/lib/kolla/config_files,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-ck7kf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod openstack-galera-0_keystone-kuttl-tests(1474c5bc-29c4-4da3-b2e9-900196941f19): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Nov 28 13:34:08 crc kubenswrapper[4970]: E1128 13:34:08.351844 4970 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mysql-bootstrap\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="keystone-kuttl-tests/openstack-galera-0" podUID="1474c5bc-29c4-4da3-b2e9-900196941f19" Nov 28 13:34:08 crc kubenswrapper[4970]: E1128 13:34:08.380373 4970 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-mariadb@sha256:10452e2144368e2f128c8fb8ef9e54880b06ef1d71d9f084a0217dcb099c51ce" Nov 28 13:34:08 crc kubenswrapper[4970]: E1128 13:34:08.380794 4970 
kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:mysql-bootstrap,Image:quay.io/podified-antelope-centos9/openstack-mariadb@sha256:10452e2144368e2f128c8fb8ef9e54880b06ef1d71d9f084a0217dcb099c51ce,Command:[bash /var/lib/operator-scripts/mysql_bootstrap.sh],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:True,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:mysql-db,ReadOnly:false,MountPath:/var/lib/mysql,SubPath:mysql,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data-default,ReadOnly:true,MountPath:/var/lib/config-data/default,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data-generated,ReadOnly:false,MountPath:/var/lib/config-data/generated,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:operator-scripts,ReadOnly:true,MountPath:/var/lib/operator-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kolla-config,ReadOnly:true,MountPath:/var/lib/kolla/config_files,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-fg244,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod openstack-galera-2_keystone-kuttl-tests(70137649-04fe-46dd-94ef-03a6ab19aecd): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Nov 28 13:34:08 crc kubenswrapper[4970]: E1128 13:34:08.382342 4970 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mysql-bootstrap\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="keystone-kuttl-tests/openstack-galera-2" podUID="70137649-04fe-46dd-94ef-03a6ab19aecd" Nov 28 13:34:08 crc kubenswrapper[4970]: I1128 13:34:08.553649 4970 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-bzs99"] Nov 28 13:34:08 crc kubenswrapper[4970]: I1128 13:34:08.869797 4970 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-index-kjd76"] Nov 28 13:34:08 crc kubenswrapper[4970]: W1128 13:34:08.898836 4970 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda8bd99b0_8e47_4a38_bedc_aad5cf3a7394.slice/crio-0428e4dd9f595a76ea17a6af15e3dbab5d4935a3a49967745edb48c3d9ef52ba WatchSource:0}: Error finding container 0428e4dd9f595a76ea17a6af15e3dbab5d4935a3a49967745edb48c3d9ef52ba: Status 404 returned error can't find the container with id 
0428e4dd9f595a76ea17a6af15e3dbab5d4935a3a49967745edb48c3d9ef52ba Nov 28 13:34:09 crc kubenswrapper[4970]: I1128 13:34:09.287210 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-index-kjd76" event={"ID":"a8bd99b0-8e47-4a38-bedc-aad5cf3a7394","Type":"ContainerStarted","Data":"0428e4dd9f595a76ea17a6af15e3dbab5d4935a3a49967745edb48c3d9ef52ba"} Nov 28 13:34:09 crc kubenswrapper[4970]: I1128 13:34:09.288741 4970 generic.go:334] "Generic (PLEG): container finished" podID="4855b957-473e-4623-bd8d-faf428f492da" containerID="12b58c3482a67843828736621bffa09d065c7e38f4bde9c58d9b86ecc385e375" exitCode=0 Nov 28 13:34:09 crc kubenswrapper[4970]: I1128 13:34:09.289625 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bzs99" event={"ID":"4855b957-473e-4623-bd8d-faf428f492da","Type":"ContainerDied","Data":"12b58c3482a67843828736621bffa09d065c7e38f4bde9c58d9b86ecc385e375"} Nov 28 13:34:09 crc kubenswrapper[4970]: I1128 13:34:09.289690 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bzs99" event={"ID":"4855b957-473e-4623-bd8d-faf428f492da","Type":"ContainerStarted","Data":"260d82bc2400bfe67d1423ac4f177314495ba809daf687e2a1c6c01ac9128404"} Nov 28 13:34:09 crc kubenswrapper[4970]: E1128 13:34:09.290596 4970 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mysql-bootstrap\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-mariadb@sha256:10452e2144368e2f128c8fb8ef9e54880b06ef1d71d9f084a0217dcb099c51ce\\\"\"" pod="keystone-kuttl-tests/openstack-galera-0" podUID="1474c5bc-29c4-4da3-b2e9-900196941f19" Nov 28 13:34:09 crc kubenswrapper[4970]: E1128 13:34:09.291262 4970 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mysql-bootstrap\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-mariadb@sha256:10452e2144368e2f128c8fb8ef9e54880b06ef1d71d9f084a0217dcb099c51ce\\\"\"" pod="keystone-kuttl-tests/openstack-galera-2" podUID="70137649-04fe-46dd-94ef-03a6ab19aecd" Nov 28 13:34:09 crc kubenswrapper[4970]: E1128 13:34:09.291992 4970 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mysql-bootstrap\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-mariadb@sha256:10452e2144368e2f128c8fb8ef9e54880b06ef1d71d9f084a0217dcb099c51ce\\\"\"" pod="keystone-kuttl-tests/openstack-galera-1" podUID="3a4491a2-79c8-4e5b-8f2f-6c8182f09885" Nov 28 13:34:13 crc kubenswrapper[4970]: I1128 13:34:13.318541 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-index-kjd76" event={"ID":"a8bd99b0-8e47-4a38-bedc-aad5cf3a7394","Type":"ContainerStarted","Data":"bd0a1d0b1c41038773d89480fb84f837c99ab547ebacd02a7ca87bbd3cc5fccc"} Nov 28 13:34:13 crc kubenswrapper[4970]: I1128 13:34:13.333801 4970 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/rabbitmq-cluster-operator-index-kjd76" podStartSLOduration=11.268811967 podStartE2EDuration="15.333778302s" podCreationTimestamp="2025-11-28 13:33:58 +0000 UTC" firstStartedPulling="2025-11-28 13:34:08.905081586 +0000 UTC m=+859.757963426" lastFinishedPulling="2025-11-28 13:34:12.970047921 +0000 UTC m=+863.822929761" observedRunningTime="2025-11-28 13:34:13.332427374 +0000 UTC m=+864.185309194" 
watchObservedRunningTime="2025-11-28 13:34:13.333778302 +0000 UTC m=+864.186660132" Nov 28 13:34:16 crc kubenswrapper[4970]: I1128 13:34:16.341350 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bzs99" event={"ID":"4855b957-473e-4623-bd8d-faf428f492da","Type":"ContainerStarted","Data":"b2e0f36ce5dee91b7e0206f1c5e2b2ee3907031e564afaaf6f2dafe0e2aad251"} Nov 28 13:34:16 crc kubenswrapper[4970]: I1128 13:34:16.683187 4970 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["keystone-kuttl-tests/memcached-0"] Nov 28 13:34:16 crc kubenswrapper[4970]: I1128 13:34:16.684399 4970 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/memcached-0" Nov 28 13:34:16 crc kubenswrapper[4970]: I1128 13:34:16.686957 4970 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"memcached-memcached-dockercfg-hznkr" Nov 28 13:34:16 crc kubenswrapper[4970]: I1128 13:34:16.687690 4970 reflector.go:368] Caches populated for *v1.ConfigMap from object-"keystone-kuttl-tests"/"memcached-config-data" Nov 28 13:34:16 crc kubenswrapper[4970]: I1128 13:34:16.718785 4970 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/memcached-0"] Nov 28 13:34:16 crc kubenswrapper[4970]: I1128 13:34:16.776064 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b351352a-c436-4df2-9d43-f7dde4bb6a8a-config-data\") pod \"memcached-0\" (UID: \"b351352a-c436-4df2-9d43-f7dde4bb6a8a\") " pod="keystone-kuttl-tests/memcached-0" Nov 28 13:34:16 crc kubenswrapper[4970]: I1128 13:34:16.776168 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/b351352a-c436-4df2-9d43-f7dde4bb6a8a-kolla-config\") pod \"memcached-0\" (UID: \"b351352a-c436-4df2-9d43-f7dde4bb6a8a\") " pod="keystone-kuttl-tests/memcached-0" Nov 28 13:34:16 crc kubenswrapper[4970]: I1128 13:34:16.776414 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ds9v5\" (UniqueName: \"kubernetes.io/projected/b351352a-c436-4df2-9d43-f7dde4bb6a8a-kube-api-access-ds9v5\") pod \"memcached-0\" (UID: \"b351352a-c436-4df2-9d43-f7dde4bb6a8a\") " pod="keystone-kuttl-tests/memcached-0" Nov 28 13:34:16 crc kubenswrapper[4970]: I1128 13:34:16.878076 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b351352a-c436-4df2-9d43-f7dde4bb6a8a-config-data\") pod \"memcached-0\" (UID: \"b351352a-c436-4df2-9d43-f7dde4bb6a8a\") " pod="keystone-kuttl-tests/memcached-0" Nov 28 13:34:16 crc kubenswrapper[4970]: I1128 13:34:16.878456 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/b351352a-c436-4df2-9d43-f7dde4bb6a8a-kolla-config\") pod \"memcached-0\" (UID: \"b351352a-c436-4df2-9d43-f7dde4bb6a8a\") " pod="keystone-kuttl-tests/memcached-0" Nov 28 13:34:16 crc kubenswrapper[4970]: I1128 13:34:16.878612 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ds9v5\" (UniqueName: \"kubernetes.io/projected/b351352a-c436-4df2-9d43-f7dde4bb6a8a-kube-api-access-ds9v5\") pod \"memcached-0\" (UID: \"b351352a-c436-4df2-9d43-f7dde4bb6a8a\") " pod="keystone-kuttl-tests/memcached-0" Nov 28 13:34:16 crc 
kubenswrapper[4970]: I1128 13:34:16.879673 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/b351352a-c436-4df2-9d43-f7dde4bb6a8a-kolla-config\") pod \"memcached-0\" (UID: \"b351352a-c436-4df2-9d43-f7dde4bb6a8a\") " pod="keystone-kuttl-tests/memcached-0" Nov 28 13:34:16 crc kubenswrapper[4970]: I1128 13:34:16.879698 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b351352a-c436-4df2-9d43-f7dde4bb6a8a-config-data\") pod \"memcached-0\" (UID: \"b351352a-c436-4df2-9d43-f7dde4bb6a8a\") " pod="keystone-kuttl-tests/memcached-0" Nov 28 13:34:16 crc kubenswrapper[4970]: I1128 13:34:16.908840 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ds9v5\" (UniqueName: \"kubernetes.io/projected/b351352a-c436-4df2-9d43-f7dde4bb6a8a-kube-api-access-ds9v5\") pod \"memcached-0\" (UID: \"b351352a-c436-4df2-9d43-f7dde4bb6a8a\") " pod="keystone-kuttl-tests/memcached-0" Nov 28 13:34:17 crc kubenswrapper[4970]: I1128 13:34:17.013514 4970 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/memcached-0" Nov 28 13:34:17 crc kubenswrapper[4970]: I1128 13:34:17.357057 4970 generic.go:334] "Generic (PLEG): container finished" podID="4855b957-473e-4623-bd8d-faf428f492da" containerID="b2e0f36ce5dee91b7e0206f1c5e2b2ee3907031e564afaaf6f2dafe0e2aad251" exitCode=0 Nov 28 13:34:17 crc kubenswrapper[4970]: I1128 13:34:17.357153 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bzs99" event={"ID":"4855b957-473e-4623-bd8d-faf428f492da","Type":"ContainerDied","Data":"b2e0f36ce5dee91b7e0206f1c5e2b2ee3907031e564afaaf6f2dafe0e2aad251"} Nov 28 13:34:17 crc kubenswrapper[4970]: I1128 13:34:17.656871 4970 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/memcached-0"] Nov 28 13:34:17 crc kubenswrapper[4970]: W1128 13:34:17.670306 4970 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb351352a_c436_4df2_9d43_f7dde4bb6a8a.slice/crio-b2e066cac3acf317439141be0b6c32251c9409762ccb261a91bc49eafd7b8ae5 WatchSource:0}: Error finding container b2e066cac3acf317439141be0b6c32251c9409762ccb261a91bc49eafd7b8ae5: Status 404 returned error can't find the container with id b2e066cac3acf317439141be0b6c32251c9409762ccb261a91bc49eafd7b8ae5 Nov 28 13:34:18 crc kubenswrapper[4970]: I1128 13:34:18.367658 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bzs99" event={"ID":"4855b957-473e-4623-bd8d-faf428f492da","Type":"ContainerStarted","Data":"882c9b9d606b0226d293ffbb3d2788ac4f762c129dc6679f4ab28c2cbf030beb"} Nov 28 13:34:18 crc kubenswrapper[4970]: I1128 13:34:18.370130 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/memcached-0" event={"ID":"b351352a-c436-4df2-9d43-f7dde4bb6a8a","Type":"ContainerStarted","Data":"b2e066cac3acf317439141be0b6c32251c9409762ccb261a91bc49eafd7b8ae5"} Nov 28 13:34:18 crc kubenswrapper[4970]: I1128 13:34:18.403076 4970 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-bzs99" podStartSLOduration=4.757109141 podStartE2EDuration="13.403008518s" podCreationTimestamp="2025-11-28 13:34:05 +0000 UTC" firstStartedPulling="2025-11-28 13:34:09.291561383 +0000 UTC m=+860.144443213" 
lastFinishedPulling="2025-11-28 13:34:17.93746075 +0000 UTC m=+868.790342590" observedRunningTime="2025-11-28 13:34:18.3963106 +0000 UTC m=+869.249192470" watchObservedRunningTime="2025-11-28 13:34:18.403008518 +0000 UTC m=+869.255890348" Nov 28 13:34:19 crc kubenswrapper[4970]: I1128 13:34:19.185431 4970 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/rabbitmq-cluster-operator-index-kjd76" Nov 28 13:34:19 crc kubenswrapper[4970]: I1128 13:34:19.185528 4970 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/rabbitmq-cluster-operator-index-kjd76" Nov 28 13:34:19 crc kubenswrapper[4970]: I1128 13:34:19.210046 4970 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/rabbitmq-cluster-operator-index-kjd76" Nov 28 13:34:19 crc kubenswrapper[4970]: I1128 13:34:19.437630 4970 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/rabbitmq-cluster-operator-index-kjd76" Nov 28 13:34:21 crc kubenswrapper[4970]: I1128 13:34:21.396837 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/memcached-0" event={"ID":"b351352a-c436-4df2-9d43-f7dde4bb6a8a","Type":"ContainerStarted","Data":"d91e788ea7590bdafe486d27d197219035b40a1b36c00192819e88eb4b37295a"} Nov 28 13:34:21 crc kubenswrapper[4970]: I1128 13:34:21.397434 4970 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="keystone-kuttl-tests/memcached-0" Nov 28 13:34:21 crc kubenswrapper[4970]: I1128 13:34:21.423569 4970 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="keystone-kuttl-tests/memcached-0" podStartSLOduration=2.870467038 podStartE2EDuration="5.423540737s" podCreationTimestamp="2025-11-28 13:34:16 +0000 UTC" firstStartedPulling="2025-11-28 13:34:17.67712963 +0000 UTC m=+868.530011440" lastFinishedPulling="2025-11-28 13:34:20.230203339 +0000 UTC m=+871.083085139" observedRunningTime="2025-11-28 13:34:21.420462561 +0000 UTC m=+872.273344371" watchObservedRunningTime="2025-11-28 13:34:21.423540737 +0000 UTC m=+872.276422567" Nov 28 13:34:22 crc kubenswrapper[4970]: I1128 13:34:22.072898 4970 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-5gwx8"] Nov 28 13:34:22 crc kubenswrapper[4970]: I1128 13:34:22.074547 4970 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-5gwx8" Nov 28 13:34:22 crc kubenswrapper[4970]: I1128 13:34:22.083168 4970 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-5gwx8"] Nov 28 13:34:22 crc kubenswrapper[4970]: I1128 13:34:22.164145 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3acf03f9-181c-43b8-bc25-24fac1b02311-utilities\") pod \"certified-operators-5gwx8\" (UID: \"3acf03f9-181c-43b8-bc25-24fac1b02311\") " pod="openshift-marketplace/certified-operators-5gwx8" Nov 28 13:34:22 crc kubenswrapper[4970]: I1128 13:34:22.164231 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3acf03f9-181c-43b8-bc25-24fac1b02311-catalog-content\") pod \"certified-operators-5gwx8\" (UID: \"3acf03f9-181c-43b8-bc25-24fac1b02311\") " pod="openshift-marketplace/certified-operators-5gwx8" Nov 28 13:34:22 crc kubenswrapper[4970]: I1128 13:34:22.164345 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-shns9\" (UniqueName: \"kubernetes.io/projected/3acf03f9-181c-43b8-bc25-24fac1b02311-kube-api-access-shns9\") pod \"certified-operators-5gwx8\" (UID: \"3acf03f9-181c-43b8-bc25-24fac1b02311\") " pod="openshift-marketplace/certified-operators-5gwx8" Nov 28 13:34:22 crc kubenswrapper[4970]: I1128 13:34:22.265903 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3acf03f9-181c-43b8-bc25-24fac1b02311-utilities\") pod \"certified-operators-5gwx8\" (UID: \"3acf03f9-181c-43b8-bc25-24fac1b02311\") " pod="openshift-marketplace/certified-operators-5gwx8" Nov 28 13:34:22 crc kubenswrapper[4970]: I1128 13:34:22.266039 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3acf03f9-181c-43b8-bc25-24fac1b02311-catalog-content\") pod \"certified-operators-5gwx8\" (UID: \"3acf03f9-181c-43b8-bc25-24fac1b02311\") " pod="openshift-marketplace/certified-operators-5gwx8" Nov 28 13:34:22 crc kubenswrapper[4970]: I1128 13:34:22.266083 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-shns9\" (UniqueName: \"kubernetes.io/projected/3acf03f9-181c-43b8-bc25-24fac1b02311-kube-api-access-shns9\") pod \"certified-operators-5gwx8\" (UID: \"3acf03f9-181c-43b8-bc25-24fac1b02311\") " pod="openshift-marketplace/certified-operators-5gwx8" Nov 28 13:34:22 crc kubenswrapper[4970]: I1128 13:34:22.266882 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3acf03f9-181c-43b8-bc25-24fac1b02311-utilities\") pod \"certified-operators-5gwx8\" (UID: \"3acf03f9-181c-43b8-bc25-24fac1b02311\") " pod="openshift-marketplace/certified-operators-5gwx8" Nov 28 13:34:22 crc kubenswrapper[4970]: I1128 13:34:22.266973 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3acf03f9-181c-43b8-bc25-24fac1b02311-catalog-content\") pod \"certified-operators-5gwx8\" (UID: \"3acf03f9-181c-43b8-bc25-24fac1b02311\") " pod="openshift-marketplace/certified-operators-5gwx8" Nov 28 13:34:22 crc kubenswrapper[4970]: I1128 13:34:22.294684 4970 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-shns9\" (UniqueName: \"kubernetes.io/projected/3acf03f9-181c-43b8-bc25-24fac1b02311-kube-api-access-shns9\") pod \"certified-operators-5gwx8\" (UID: \"3acf03f9-181c-43b8-bc25-24fac1b02311\") " pod="openshift-marketplace/certified-operators-5gwx8" Nov 28 13:34:22 crc kubenswrapper[4970]: I1128 13:34:22.395682 4970 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-5gwx8" Nov 28 13:34:22 crc kubenswrapper[4970]: I1128 13:34:22.874883 4970 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-5gwx8"] Nov 28 13:34:22 crc kubenswrapper[4970]: W1128 13:34:22.885553 4970 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3acf03f9_181c_43b8_bc25_24fac1b02311.slice/crio-ceec4c355a4bdb5c553a705ac65934c00d5c940330a42bff57d279832d7b3f79 WatchSource:0}: Error finding container ceec4c355a4bdb5c553a705ac65934c00d5c940330a42bff57d279832d7b3f79: Status 404 returned error can't find the container with id ceec4c355a4bdb5c553a705ac65934c00d5c940330a42bff57d279832d7b3f79 Nov 28 13:34:23 crc kubenswrapper[4970]: I1128 13:34:23.413032 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5gwx8" event={"ID":"3acf03f9-181c-43b8-bc25-24fac1b02311","Type":"ContainerStarted","Data":"ceec4c355a4bdb5c553a705ac65934c00d5c940330a42bff57d279832d7b3f79"} Nov 28 13:34:24 crc kubenswrapper[4970]: I1128 13:34:24.429658 4970 generic.go:334] "Generic (PLEG): container finished" podID="3acf03f9-181c-43b8-bc25-24fac1b02311" containerID="e30a3669de731c4f4147a25d4fe5e330512dd7f077b43f812439e8edf62dafa3" exitCode=0 Nov 28 13:34:24 crc kubenswrapper[4970]: I1128 13:34:24.429780 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5gwx8" event={"ID":"3acf03f9-181c-43b8-bc25-24fac1b02311","Type":"ContainerDied","Data":"e30a3669de731c4f4147a25d4fe5e330512dd7f077b43f812439e8edf62dafa3"} Nov 28 13:34:25 crc kubenswrapper[4970]: I1128 13:34:25.440028 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/openstack-galera-2" event={"ID":"70137649-04fe-46dd-94ef-03a6ab19aecd","Type":"ContainerStarted","Data":"1071d951f3093c2095f7c2818f8fa07d75896e34c9ccc20ee0dd72e86a64d0a9"} Nov 28 13:34:25 crc kubenswrapper[4970]: I1128 13:34:25.442280 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/openstack-galera-0" event={"ID":"1474c5bc-29c4-4da3-b2e9-900196941f19","Type":"ContainerStarted","Data":"7781a0ed5aead47f5ad959cf42fe8393ca0218eaabfa3e04b3d3565eac7ce46b"} Nov 28 13:34:25 crc kubenswrapper[4970]: I1128 13:34:25.445669 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/openstack-galera-1" event={"ID":"3a4491a2-79c8-4e5b-8f2f-6c8182f09885","Type":"ContainerStarted","Data":"d12f8faa293ffb2e3f55fb029a4b7998534bf5be5c47fd5bc159d5558ceb280d"} Nov 28 13:34:25 crc kubenswrapper[4970]: I1128 13:34:25.845055 4970 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-bzs99" Nov 28 13:34:25 crc kubenswrapper[4970]: I1128 13:34:25.846926 4970 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-bzs99" Nov 28 13:34:25 crc kubenswrapper[4970]: I1128 13:34:25.897921 4970 kubelet.go:2542] "SyncLoop (probe)" 
probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-bzs99" Nov 28 13:34:26 crc kubenswrapper[4970]: I1128 13:34:26.458615 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5gwx8" event={"ID":"3acf03f9-181c-43b8-bc25-24fac1b02311","Type":"ContainerStarted","Data":"936444902abed013b7375284637490464a7da41689ea144a4c7995e00905363d"} Nov 28 13:34:26 crc kubenswrapper[4970]: I1128 13:34:26.599851 4970 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-bzs99" Nov 28 13:34:27 crc kubenswrapper[4970]: I1128 13:34:27.015303 4970 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="keystone-kuttl-tests/memcached-0" Nov 28 13:34:27 crc kubenswrapper[4970]: I1128 13:34:27.465717 4970 generic.go:334] "Generic (PLEG): container finished" podID="3acf03f9-181c-43b8-bc25-24fac1b02311" containerID="936444902abed013b7375284637490464a7da41689ea144a4c7995e00905363d" exitCode=0 Nov 28 13:34:27 crc kubenswrapper[4970]: I1128 13:34:27.466453 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5gwx8" event={"ID":"3acf03f9-181c-43b8-bc25-24fac1b02311","Type":"ContainerDied","Data":"936444902abed013b7375284637490464a7da41689ea144a4c7995e00905363d"} Nov 28 13:34:28 crc kubenswrapper[4970]: I1128 13:34:28.114033 4970 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e5906mn5f"] Nov 28 13:34:28 crc kubenswrapper[4970]: I1128 13:34:28.115184 4970 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e5906mn5f" Nov 28 13:34:28 crc kubenswrapper[4970]: I1128 13:34:28.128669 4970 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e5906mn5f"] Nov 28 13:34:28 crc kubenswrapper[4970]: I1128 13:34:28.129204 4970 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-77hkb" Nov 28 13:34:28 crc kubenswrapper[4970]: I1128 13:34:28.282526 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/0837c7a8-a61d-44b1-9f7d-ad3a758a0b9a-bundle\") pod \"9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e5906mn5f\" (UID: \"0837c7a8-a61d-44b1-9f7d-ad3a758a0b9a\") " pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e5906mn5f" Nov 28 13:34:28 crc kubenswrapper[4970]: I1128 13:34:28.282608 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n24fv\" (UniqueName: \"kubernetes.io/projected/0837c7a8-a61d-44b1-9f7d-ad3a758a0b9a-kube-api-access-n24fv\") pod \"9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e5906mn5f\" (UID: \"0837c7a8-a61d-44b1-9f7d-ad3a758a0b9a\") " pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e5906mn5f" Nov 28 13:34:28 crc kubenswrapper[4970]: I1128 13:34:28.282732 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/0837c7a8-a61d-44b1-9f7d-ad3a758a0b9a-util\") pod \"9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e5906mn5f\" (UID: \"0837c7a8-a61d-44b1-9f7d-ad3a758a0b9a\") " 
pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e5906mn5f" Nov 28 13:34:28 crc kubenswrapper[4970]: I1128 13:34:28.383697 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n24fv\" (UniqueName: \"kubernetes.io/projected/0837c7a8-a61d-44b1-9f7d-ad3a758a0b9a-kube-api-access-n24fv\") pod \"9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e5906mn5f\" (UID: \"0837c7a8-a61d-44b1-9f7d-ad3a758a0b9a\") " pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e5906mn5f" Nov 28 13:34:28 crc kubenswrapper[4970]: I1128 13:34:28.383774 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/0837c7a8-a61d-44b1-9f7d-ad3a758a0b9a-util\") pod \"9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e5906mn5f\" (UID: \"0837c7a8-a61d-44b1-9f7d-ad3a758a0b9a\") " pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e5906mn5f" Nov 28 13:34:28 crc kubenswrapper[4970]: I1128 13:34:28.383842 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/0837c7a8-a61d-44b1-9f7d-ad3a758a0b9a-bundle\") pod \"9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e5906mn5f\" (UID: \"0837c7a8-a61d-44b1-9f7d-ad3a758a0b9a\") " pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e5906mn5f" Nov 28 13:34:28 crc kubenswrapper[4970]: I1128 13:34:28.384340 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/0837c7a8-a61d-44b1-9f7d-ad3a758a0b9a-bundle\") pod \"9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e5906mn5f\" (UID: \"0837c7a8-a61d-44b1-9f7d-ad3a758a0b9a\") " pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e5906mn5f" Nov 28 13:34:28 crc kubenswrapper[4970]: I1128 13:34:28.384548 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/0837c7a8-a61d-44b1-9f7d-ad3a758a0b9a-util\") pod \"9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e5906mn5f\" (UID: \"0837c7a8-a61d-44b1-9f7d-ad3a758a0b9a\") " pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e5906mn5f" Nov 28 13:34:28 crc kubenswrapper[4970]: I1128 13:34:28.407625 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n24fv\" (UniqueName: \"kubernetes.io/projected/0837c7a8-a61d-44b1-9f7d-ad3a758a0b9a-kube-api-access-n24fv\") pod \"9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e5906mn5f\" (UID: \"0837c7a8-a61d-44b1-9f7d-ad3a758a0b9a\") " pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e5906mn5f" Nov 28 13:34:28 crc kubenswrapper[4970]: I1128 13:34:28.435272 4970 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e5906mn5f" Nov 28 13:34:28 crc kubenswrapper[4970]: I1128 13:34:28.944018 4970 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e5906mn5f"] Nov 28 13:34:29 crc kubenswrapper[4970]: I1128 13:34:29.497692 4970 generic.go:334] "Generic (PLEG): container finished" podID="0837c7a8-a61d-44b1-9f7d-ad3a758a0b9a" containerID="ac839bd992f65562180f2ee157a333bad648870db1d9b2703daa6060d6d17b4b" exitCode=0 Nov 28 13:34:29 crc kubenswrapper[4970]: I1128 13:34:29.502718 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e5906mn5f" event={"ID":"0837c7a8-a61d-44b1-9f7d-ad3a758a0b9a","Type":"ContainerDied","Data":"ac839bd992f65562180f2ee157a333bad648870db1d9b2703daa6060d6d17b4b"} Nov 28 13:34:29 crc kubenswrapper[4970]: I1128 13:34:29.502801 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e5906mn5f" event={"ID":"0837c7a8-a61d-44b1-9f7d-ad3a758a0b9a","Type":"ContainerStarted","Data":"2e35fb1132a9eb4595697f1ad354c68c6b4c4ef5b5a3cdbfbdd430c7209e50ce"} Nov 28 13:34:29 crc kubenswrapper[4970]: I1128 13:34:29.512680 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5gwx8" event={"ID":"3acf03f9-181c-43b8-bc25-24fac1b02311","Type":"ContainerStarted","Data":"697391c8038ede24603f304d41c68e188ffc53f7be78152c46cccee74d1f17fa"} Nov 28 13:34:29 crc kubenswrapper[4970]: I1128 13:34:29.544303 4970 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-5gwx8" podStartSLOduration=3.446088294 podStartE2EDuration="7.544268848s" podCreationTimestamp="2025-11-28 13:34:22 +0000 UTC" firstStartedPulling="2025-11-28 13:34:24.43204499 +0000 UTC m=+875.284926830" lastFinishedPulling="2025-11-28 13:34:28.530225564 +0000 UTC m=+879.383107384" observedRunningTime="2025-11-28 13:34:29.538826766 +0000 UTC m=+880.391708596" watchObservedRunningTime="2025-11-28 13:34:29.544268848 +0000 UTC m=+880.397150648" Nov 28 13:34:30 crc kubenswrapper[4970]: I1128 13:34:30.459568 4970 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-bzs99"] Nov 28 13:34:30 crc kubenswrapper[4970]: I1128 13:34:30.460079 4970 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-bzs99" podUID="4855b957-473e-4623-bd8d-faf428f492da" containerName="registry-server" containerID="cri-o://882c9b9d606b0226d293ffbb3d2788ac4f762c129dc6679f4ab28c2cbf030beb" gracePeriod=2 Nov 28 13:34:30 crc kubenswrapper[4970]: I1128 13:34:30.890021 4970 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-bzs99" Nov 28 13:34:31 crc kubenswrapper[4970]: I1128 13:34:31.046402 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4855b957-473e-4623-bd8d-faf428f492da-catalog-content\") pod \"4855b957-473e-4623-bd8d-faf428f492da\" (UID: \"4855b957-473e-4623-bd8d-faf428f492da\") " Nov 28 13:34:31 crc kubenswrapper[4970]: I1128 13:34:31.046465 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8g6fd\" (UniqueName: \"kubernetes.io/projected/4855b957-473e-4623-bd8d-faf428f492da-kube-api-access-8g6fd\") pod \"4855b957-473e-4623-bd8d-faf428f492da\" (UID: \"4855b957-473e-4623-bd8d-faf428f492da\") " Nov 28 13:34:31 crc kubenswrapper[4970]: I1128 13:34:31.046575 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4855b957-473e-4623-bd8d-faf428f492da-utilities\") pod \"4855b957-473e-4623-bd8d-faf428f492da\" (UID: \"4855b957-473e-4623-bd8d-faf428f492da\") " Nov 28 13:34:31 crc kubenswrapper[4970]: I1128 13:34:31.047826 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4855b957-473e-4623-bd8d-faf428f492da-utilities" (OuterVolumeSpecName: "utilities") pod "4855b957-473e-4623-bd8d-faf428f492da" (UID: "4855b957-473e-4623-bd8d-faf428f492da"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 28 13:34:31 crc kubenswrapper[4970]: I1128 13:34:31.053105 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4855b957-473e-4623-bd8d-faf428f492da-kube-api-access-8g6fd" (OuterVolumeSpecName: "kube-api-access-8g6fd") pod "4855b957-473e-4623-bd8d-faf428f492da" (UID: "4855b957-473e-4623-bd8d-faf428f492da"). InnerVolumeSpecName "kube-api-access-8g6fd". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 28 13:34:31 crc kubenswrapper[4970]: I1128 13:34:31.080638 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4855b957-473e-4623-bd8d-faf428f492da-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "4855b957-473e-4623-bd8d-faf428f492da" (UID: "4855b957-473e-4623-bd8d-faf428f492da"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 28 13:34:31 crc kubenswrapper[4970]: I1128 13:34:31.148193 4970 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4855b957-473e-4623-bd8d-faf428f492da-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 28 13:34:31 crc kubenswrapper[4970]: I1128 13:34:31.148248 4970 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8g6fd\" (UniqueName: \"kubernetes.io/projected/4855b957-473e-4623-bd8d-faf428f492da-kube-api-access-8g6fd\") on node \"crc\" DevicePath \"\"" Nov 28 13:34:31 crc kubenswrapper[4970]: I1128 13:34:31.148264 4970 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4855b957-473e-4623-bd8d-faf428f492da-utilities\") on node \"crc\" DevicePath \"\"" Nov 28 13:34:31 crc kubenswrapper[4970]: I1128 13:34:31.528283 4970 generic.go:334] "Generic (PLEG): container finished" podID="1474c5bc-29c4-4da3-b2e9-900196941f19" containerID="7781a0ed5aead47f5ad959cf42fe8393ca0218eaabfa3e04b3d3565eac7ce46b" exitCode=0 Nov 28 13:34:31 crc kubenswrapper[4970]: I1128 13:34:31.528359 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/openstack-galera-0" event={"ID":"1474c5bc-29c4-4da3-b2e9-900196941f19","Type":"ContainerDied","Data":"7781a0ed5aead47f5ad959cf42fe8393ca0218eaabfa3e04b3d3565eac7ce46b"} Nov 28 13:34:31 crc kubenswrapper[4970]: I1128 13:34:31.531949 4970 generic.go:334] "Generic (PLEG): container finished" podID="4855b957-473e-4623-bd8d-faf428f492da" containerID="882c9b9d606b0226d293ffbb3d2788ac4f762c129dc6679f4ab28c2cbf030beb" exitCode=0 Nov 28 13:34:31 crc kubenswrapper[4970]: I1128 13:34:31.532012 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bzs99" event={"ID":"4855b957-473e-4623-bd8d-faf428f492da","Type":"ContainerDied","Data":"882c9b9d606b0226d293ffbb3d2788ac4f762c129dc6679f4ab28c2cbf030beb"} Nov 28 13:34:31 crc kubenswrapper[4970]: I1128 13:34:31.532037 4970 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-bzs99" Nov 28 13:34:31 crc kubenswrapper[4970]: I1128 13:34:31.532189 4970 scope.go:117] "RemoveContainer" containerID="882c9b9d606b0226d293ffbb3d2788ac4f762c129dc6679f4ab28c2cbf030beb" Nov 28 13:34:31 crc kubenswrapper[4970]: I1128 13:34:31.532072 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bzs99" event={"ID":"4855b957-473e-4623-bd8d-faf428f492da","Type":"ContainerDied","Data":"260d82bc2400bfe67d1423ac4f177314495ba809daf687e2a1c6c01ac9128404"} Nov 28 13:34:31 crc kubenswrapper[4970]: I1128 13:34:31.535695 4970 generic.go:334] "Generic (PLEG): container finished" podID="0837c7a8-a61d-44b1-9f7d-ad3a758a0b9a" containerID="edbbe998ab544d7f6aaa537697b28bda62dfeb981181612410fb6aa92dfbc21a" exitCode=0 Nov 28 13:34:31 crc kubenswrapper[4970]: I1128 13:34:31.535742 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e5906mn5f" event={"ID":"0837c7a8-a61d-44b1-9f7d-ad3a758a0b9a","Type":"ContainerDied","Data":"edbbe998ab544d7f6aaa537697b28bda62dfeb981181612410fb6aa92dfbc21a"} Nov 28 13:34:31 crc kubenswrapper[4970]: I1128 13:34:31.538180 4970 generic.go:334] "Generic (PLEG): container finished" podID="3a4491a2-79c8-4e5b-8f2f-6c8182f09885" containerID="d12f8faa293ffb2e3f55fb029a4b7998534bf5be5c47fd5bc159d5558ceb280d" exitCode=0 Nov 28 13:34:31 crc kubenswrapper[4970]: I1128 13:34:31.538249 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/openstack-galera-1" event={"ID":"3a4491a2-79c8-4e5b-8f2f-6c8182f09885","Type":"ContainerDied","Data":"d12f8faa293ffb2e3f55fb029a4b7998534bf5be5c47fd5bc159d5558ceb280d"} Nov 28 13:34:31 crc kubenswrapper[4970]: I1128 13:34:31.544528 4970 generic.go:334] "Generic (PLEG): container finished" podID="70137649-04fe-46dd-94ef-03a6ab19aecd" containerID="1071d951f3093c2095f7c2818f8fa07d75896e34c9ccc20ee0dd72e86a64d0a9" exitCode=0 Nov 28 13:34:31 crc kubenswrapper[4970]: I1128 13:34:31.544595 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/openstack-galera-2" event={"ID":"70137649-04fe-46dd-94ef-03a6ab19aecd","Type":"ContainerDied","Data":"1071d951f3093c2095f7c2818f8fa07d75896e34c9ccc20ee0dd72e86a64d0a9"} Nov 28 13:34:31 crc kubenswrapper[4970]: I1128 13:34:31.702988 4970 scope.go:117] "RemoveContainer" containerID="b2e0f36ce5dee91b7e0206f1c5e2b2ee3907031e564afaaf6f2dafe0e2aad251" Nov 28 13:34:31 crc kubenswrapper[4970]: I1128 13:34:31.735017 4970 scope.go:117] "RemoveContainer" containerID="12b58c3482a67843828736621bffa09d065c7e38f4bde9c58d9b86ecc385e375" Nov 28 13:34:31 crc kubenswrapper[4970]: I1128 13:34:31.737517 4970 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-bzs99"] Nov 28 13:34:31 crc kubenswrapper[4970]: I1128 13:34:31.743424 4970 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-bzs99"] Nov 28 13:34:31 crc kubenswrapper[4970]: I1128 13:34:31.766239 4970 scope.go:117] "RemoveContainer" containerID="882c9b9d606b0226d293ffbb3d2788ac4f762c129dc6679f4ab28c2cbf030beb" Nov 28 13:34:31 crc kubenswrapper[4970]: E1128 13:34:31.766710 4970 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"882c9b9d606b0226d293ffbb3d2788ac4f762c129dc6679f4ab28c2cbf030beb\": container with ID starting with 
882c9b9d606b0226d293ffbb3d2788ac4f762c129dc6679f4ab28c2cbf030beb not found: ID does not exist" containerID="882c9b9d606b0226d293ffbb3d2788ac4f762c129dc6679f4ab28c2cbf030beb" Nov 28 13:34:31 crc kubenswrapper[4970]: I1128 13:34:31.766792 4970 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"882c9b9d606b0226d293ffbb3d2788ac4f762c129dc6679f4ab28c2cbf030beb"} err="failed to get container status \"882c9b9d606b0226d293ffbb3d2788ac4f762c129dc6679f4ab28c2cbf030beb\": rpc error: code = NotFound desc = could not find container \"882c9b9d606b0226d293ffbb3d2788ac4f762c129dc6679f4ab28c2cbf030beb\": container with ID starting with 882c9b9d606b0226d293ffbb3d2788ac4f762c129dc6679f4ab28c2cbf030beb not found: ID does not exist" Nov 28 13:34:31 crc kubenswrapper[4970]: I1128 13:34:31.766867 4970 scope.go:117] "RemoveContainer" containerID="b2e0f36ce5dee91b7e0206f1c5e2b2ee3907031e564afaaf6f2dafe0e2aad251" Nov 28 13:34:31 crc kubenswrapper[4970]: E1128 13:34:31.767098 4970 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b2e0f36ce5dee91b7e0206f1c5e2b2ee3907031e564afaaf6f2dafe0e2aad251\": container with ID starting with b2e0f36ce5dee91b7e0206f1c5e2b2ee3907031e564afaaf6f2dafe0e2aad251 not found: ID does not exist" containerID="b2e0f36ce5dee91b7e0206f1c5e2b2ee3907031e564afaaf6f2dafe0e2aad251" Nov 28 13:34:31 crc kubenswrapper[4970]: I1128 13:34:31.767118 4970 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b2e0f36ce5dee91b7e0206f1c5e2b2ee3907031e564afaaf6f2dafe0e2aad251"} err="failed to get container status \"b2e0f36ce5dee91b7e0206f1c5e2b2ee3907031e564afaaf6f2dafe0e2aad251\": rpc error: code = NotFound desc = could not find container \"b2e0f36ce5dee91b7e0206f1c5e2b2ee3907031e564afaaf6f2dafe0e2aad251\": container with ID starting with b2e0f36ce5dee91b7e0206f1c5e2b2ee3907031e564afaaf6f2dafe0e2aad251 not found: ID does not exist" Nov 28 13:34:31 crc kubenswrapper[4970]: I1128 13:34:31.767132 4970 scope.go:117] "RemoveContainer" containerID="12b58c3482a67843828736621bffa09d065c7e38f4bde9c58d9b86ecc385e375" Nov 28 13:34:31 crc kubenswrapper[4970]: E1128 13:34:31.767489 4970 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"12b58c3482a67843828736621bffa09d065c7e38f4bde9c58d9b86ecc385e375\": container with ID starting with 12b58c3482a67843828736621bffa09d065c7e38f4bde9c58d9b86ecc385e375 not found: ID does not exist" containerID="12b58c3482a67843828736621bffa09d065c7e38f4bde9c58d9b86ecc385e375" Nov 28 13:34:31 crc kubenswrapper[4970]: I1128 13:34:31.767506 4970 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"12b58c3482a67843828736621bffa09d065c7e38f4bde9c58d9b86ecc385e375"} err="failed to get container status \"12b58c3482a67843828736621bffa09d065c7e38f4bde9c58d9b86ecc385e375\": rpc error: code = NotFound desc = could not find container \"12b58c3482a67843828736621bffa09d065c7e38f4bde9c58d9b86ecc385e375\": container with ID starting with 12b58c3482a67843828736621bffa09d065c7e38f4bde9c58d9b86ecc385e375 not found: ID does not exist" Nov 28 13:34:32 crc kubenswrapper[4970]: I1128 13:34:32.396096 4970 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-5gwx8" Nov 28 13:34:32 crc kubenswrapper[4970]: I1128 13:34:32.396466 4970 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openshift-marketplace/certified-operators-5gwx8" Nov 28 13:34:32 crc kubenswrapper[4970]: I1128 13:34:32.439792 4970 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-5gwx8" Nov 28 13:34:32 crc kubenswrapper[4970]: I1128 13:34:32.555041 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e5906mn5f" event={"ID":"0837c7a8-a61d-44b1-9f7d-ad3a758a0b9a","Type":"ContainerStarted","Data":"779f4a67b747c4c45c7bea822c9931534849e1f5483d9ce33f185e777a205d73"} Nov 28 13:34:32 crc kubenswrapper[4970]: I1128 13:34:32.557430 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/openstack-galera-1" event={"ID":"3a4491a2-79c8-4e5b-8f2f-6c8182f09885","Type":"ContainerStarted","Data":"22e6da56be8ccc2e55abefbafa87c681b32b59e74b0592edffe81f01886f9d79"} Nov 28 13:34:32 crc kubenswrapper[4970]: I1128 13:34:32.560951 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/openstack-galera-2" event={"ID":"70137649-04fe-46dd-94ef-03a6ab19aecd","Type":"ContainerStarted","Data":"b48fb2d166b766f82ae81402f8f0b0f54e5d99a322c3e2b505036c8baaeeffde"} Nov 28 13:34:32 crc kubenswrapper[4970]: I1128 13:34:32.563370 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/openstack-galera-0" event={"ID":"1474c5bc-29c4-4da3-b2e9-900196941f19","Type":"ContainerStarted","Data":"0817c9c6a5a83728037407a7288119e8e5aa21266704002fa91de53268569359"} Nov 28 13:34:32 crc kubenswrapper[4970]: I1128 13:34:32.578987 4970 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e5906mn5f" podStartSLOduration=3.292737709 podStartE2EDuration="4.578965424s" podCreationTimestamp="2025-11-28 13:34:28 +0000 UTC" firstStartedPulling="2025-11-28 13:34:29.505485014 +0000 UTC m=+880.358366814" lastFinishedPulling="2025-11-28 13:34:30.791712729 +0000 UTC m=+881.644594529" observedRunningTime="2025-11-28 13:34:32.575929389 +0000 UTC m=+883.428811199" watchObservedRunningTime="2025-11-28 13:34:32.578965424 +0000 UTC m=+883.431847234" Nov 28 13:34:32 crc kubenswrapper[4970]: I1128 13:34:32.606828 4970 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="keystone-kuttl-tests/openstack-galera-2" podStartSLOduration=11.723613827 podStartE2EDuration="46.606811673s" podCreationTimestamp="2025-11-28 13:33:46 +0000 UTC" firstStartedPulling="2025-11-28 13:33:50.248322263 +0000 UTC m=+841.101204063" lastFinishedPulling="2025-11-28 13:34:25.131520089 +0000 UTC m=+875.984401909" observedRunningTime="2025-11-28 13:34:32.602979066 +0000 UTC m=+883.455860896" watchObservedRunningTime="2025-11-28 13:34:32.606811673 +0000 UTC m=+883.459693473" Nov 28 13:34:32 crc kubenswrapper[4970]: I1128 13:34:32.625166 4970 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="keystone-kuttl-tests/openstack-galera-1" podStartSLOduration=11.972172969 podStartE2EDuration="46.625150556s" podCreationTimestamp="2025-11-28 13:33:46 +0000 UTC" firstStartedPulling="2025-11-28 13:33:50.279524477 +0000 UTC m=+841.132406277" lastFinishedPulling="2025-11-28 13:34:24.932502014 +0000 UTC m=+875.785383864" observedRunningTime="2025-11-28 13:34:32.621759061 +0000 UTC m=+883.474640911" watchObservedRunningTime="2025-11-28 13:34:32.625150556 +0000 UTC m=+883.478032356" Nov 28 13:34:32 crc kubenswrapper[4970]: I1128 
13:34:32.642736 4970 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="keystone-kuttl-tests/openstack-galera-0" podStartSLOduration=11.503965138 podStartE2EDuration="46.642718957s" podCreationTimestamp="2025-11-28 13:33:46 +0000 UTC" firstStartedPulling="2025-11-28 13:33:49.9595107 +0000 UTC m=+840.812392500" lastFinishedPulling="2025-11-28 13:34:25.098264509 +0000 UTC m=+875.951146319" observedRunningTime="2025-11-28 13:34:32.640523415 +0000 UTC m=+883.493405215" watchObservedRunningTime="2025-11-28 13:34:32.642718957 +0000 UTC m=+883.495600767" Nov 28 13:34:33 crc kubenswrapper[4970]: I1128 13:34:33.389516 4970 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4855b957-473e-4623-bd8d-faf428f492da" path="/var/lib/kubelet/pods/4855b957-473e-4623-bd8d-faf428f492da/volumes" Nov 28 13:34:33 crc kubenswrapper[4970]: I1128 13:34:33.572660 4970 generic.go:334] "Generic (PLEG): container finished" podID="0837c7a8-a61d-44b1-9f7d-ad3a758a0b9a" containerID="779f4a67b747c4c45c7bea822c9931534849e1f5483d9ce33f185e777a205d73" exitCode=0 Nov 28 13:34:33 crc kubenswrapper[4970]: I1128 13:34:33.572698 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e5906mn5f" event={"ID":"0837c7a8-a61d-44b1-9f7d-ad3a758a0b9a","Type":"ContainerDied","Data":"779f4a67b747c4c45c7bea822c9931534849e1f5483d9ce33f185e777a205d73"} Nov 28 13:34:34 crc kubenswrapper[4970]: I1128 13:34:34.961605 4970 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e5906mn5f" Nov 28 13:34:35 crc kubenswrapper[4970]: I1128 13:34:35.053517 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/0837c7a8-a61d-44b1-9f7d-ad3a758a0b9a-bundle\") pod \"0837c7a8-a61d-44b1-9f7d-ad3a758a0b9a\" (UID: \"0837c7a8-a61d-44b1-9f7d-ad3a758a0b9a\") " Nov 28 13:34:35 crc kubenswrapper[4970]: I1128 13:34:35.053787 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n24fv\" (UniqueName: \"kubernetes.io/projected/0837c7a8-a61d-44b1-9f7d-ad3a758a0b9a-kube-api-access-n24fv\") pod \"0837c7a8-a61d-44b1-9f7d-ad3a758a0b9a\" (UID: \"0837c7a8-a61d-44b1-9f7d-ad3a758a0b9a\") " Nov 28 13:34:35 crc kubenswrapper[4970]: I1128 13:34:35.054099 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/0837c7a8-a61d-44b1-9f7d-ad3a758a0b9a-util\") pod \"0837c7a8-a61d-44b1-9f7d-ad3a758a0b9a\" (UID: \"0837c7a8-a61d-44b1-9f7d-ad3a758a0b9a\") " Nov 28 13:34:35 crc kubenswrapper[4970]: I1128 13:34:35.054432 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0837c7a8-a61d-44b1-9f7d-ad3a758a0b9a-bundle" (OuterVolumeSpecName: "bundle") pod "0837c7a8-a61d-44b1-9f7d-ad3a758a0b9a" (UID: "0837c7a8-a61d-44b1-9f7d-ad3a758a0b9a"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 28 13:34:35 crc kubenswrapper[4970]: I1128 13:34:35.054735 4970 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/0837c7a8-a61d-44b1-9f7d-ad3a758a0b9a-bundle\") on node \"crc\" DevicePath \"\"" Nov 28 13:34:35 crc kubenswrapper[4970]: I1128 13:34:35.378243 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0837c7a8-a61d-44b1-9f7d-ad3a758a0b9a-kube-api-access-n24fv" (OuterVolumeSpecName: "kube-api-access-n24fv") pod "0837c7a8-a61d-44b1-9f7d-ad3a758a0b9a" (UID: "0837c7a8-a61d-44b1-9f7d-ad3a758a0b9a"). InnerVolumeSpecName "kube-api-access-n24fv". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 28 13:34:35 crc kubenswrapper[4970]: I1128 13:34:35.397200 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0837c7a8-a61d-44b1-9f7d-ad3a758a0b9a-util" (OuterVolumeSpecName: "util") pod "0837c7a8-a61d-44b1-9f7d-ad3a758a0b9a" (UID: "0837c7a8-a61d-44b1-9f7d-ad3a758a0b9a"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 28 13:34:35 crc kubenswrapper[4970]: I1128 13:34:35.460070 4970 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n24fv\" (UniqueName: \"kubernetes.io/projected/0837c7a8-a61d-44b1-9f7d-ad3a758a0b9a-kube-api-access-n24fv\") on node \"crc\" DevicePath \"\"" Nov 28 13:34:35 crc kubenswrapper[4970]: I1128 13:34:35.460112 4970 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/0837c7a8-a61d-44b1-9f7d-ad3a758a0b9a-util\") on node \"crc\" DevicePath \"\"" Nov 28 13:34:35 crc kubenswrapper[4970]: I1128 13:34:35.589827 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e5906mn5f" event={"ID":"0837c7a8-a61d-44b1-9f7d-ad3a758a0b9a","Type":"ContainerDied","Data":"2e35fb1132a9eb4595697f1ad354c68c6b4c4ef5b5a3cdbfbdd430c7209e50ce"} Nov 28 13:34:35 crc kubenswrapper[4970]: I1128 13:34:35.589874 4970 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2e35fb1132a9eb4595697f1ad354c68c6b4c4ef5b5a3cdbfbdd430c7209e50ce" Nov 28 13:34:35 crc kubenswrapper[4970]: I1128 13:34:35.589947 4970 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e5906mn5f" Nov 28 13:34:37 crc kubenswrapper[4970]: I1128 13:34:37.973542 4970 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="keystone-kuttl-tests/openstack-galera-0" Nov 28 13:34:37 crc kubenswrapper[4970]: I1128 13:34:37.973604 4970 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="keystone-kuttl-tests/openstack-galera-0" Nov 28 13:34:38 crc kubenswrapper[4970]: I1128 13:34:38.025563 4970 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="keystone-kuttl-tests/openstack-galera-2" Nov 28 13:34:38 crc kubenswrapper[4970]: I1128 13:34:38.025646 4970 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="keystone-kuttl-tests/openstack-galera-2" Nov 28 13:34:38 crc kubenswrapper[4970]: I1128 13:34:38.308434 4970 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="keystone-kuttl-tests/openstack-galera-1" Nov 28 13:34:38 crc kubenswrapper[4970]: I1128 13:34:38.308504 4970 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="keystone-kuttl-tests/openstack-galera-1" Nov 28 13:34:42 crc kubenswrapper[4970]: I1128 13:34:42.490672 4970 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-5gwx8" Nov 28 13:34:43 crc kubenswrapper[4970]: I1128 13:34:43.886751 4970 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-779fc9694b-7mpcn"] Nov 28 13:34:43 crc kubenswrapper[4970]: E1128 13:34:43.887365 4970 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0837c7a8-a61d-44b1-9f7d-ad3a758a0b9a" containerName="pull" Nov 28 13:34:43 crc kubenswrapper[4970]: I1128 13:34:43.887380 4970 state_mem.go:107] "Deleted CPUSet assignment" podUID="0837c7a8-a61d-44b1-9f7d-ad3a758a0b9a" containerName="pull" Nov 28 13:34:43 crc kubenswrapper[4970]: E1128 13:34:43.887400 4970 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4855b957-473e-4623-bd8d-faf428f492da" containerName="registry-server" Nov 28 13:34:43 crc kubenswrapper[4970]: I1128 13:34:43.887408 4970 state_mem.go:107] "Deleted CPUSet assignment" podUID="4855b957-473e-4623-bd8d-faf428f492da" containerName="registry-server" Nov 28 13:34:43 crc kubenswrapper[4970]: E1128 13:34:43.887427 4970 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0837c7a8-a61d-44b1-9f7d-ad3a758a0b9a" containerName="util" Nov 28 13:34:43 crc kubenswrapper[4970]: I1128 13:34:43.887435 4970 state_mem.go:107] "Deleted CPUSet assignment" podUID="0837c7a8-a61d-44b1-9f7d-ad3a758a0b9a" containerName="util" Nov 28 13:34:43 crc kubenswrapper[4970]: E1128 13:34:43.887453 4970 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0837c7a8-a61d-44b1-9f7d-ad3a758a0b9a" containerName="extract" Nov 28 13:34:43 crc kubenswrapper[4970]: I1128 13:34:43.887461 4970 state_mem.go:107] "Deleted CPUSet assignment" podUID="0837c7a8-a61d-44b1-9f7d-ad3a758a0b9a" containerName="extract" Nov 28 13:34:43 crc kubenswrapper[4970]: E1128 13:34:43.887473 4970 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4855b957-473e-4623-bd8d-faf428f492da" containerName="extract-utilities" Nov 28 13:34:43 crc kubenswrapper[4970]: I1128 13:34:43.887483 4970 state_mem.go:107] "Deleted CPUSet assignment" podUID="4855b957-473e-4623-bd8d-faf428f492da" containerName="extract-utilities" Nov 28 13:34:43 crc kubenswrapper[4970]: 
E1128 13:34:43.887496 4970 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4855b957-473e-4623-bd8d-faf428f492da" containerName="extract-content" Nov 28 13:34:43 crc kubenswrapper[4970]: I1128 13:34:43.887505 4970 state_mem.go:107] "Deleted CPUSet assignment" podUID="4855b957-473e-4623-bd8d-faf428f492da" containerName="extract-content" Nov 28 13:34:43 crc kubenswrapper[4970]: I1128 13:34:43.887634 4970 memory_manager.go:354] "RemoveStaleState removing state" podUID="4855b957-473e-4623-bd8d-faf428f492da" containerName="registry-server" Nov 28 13:34:43 crc kubenswrapper[4970]: I1128 13:34:43.887652 4970 memory_manager.go:354] "RemoveStaleState removing state" podUID="0837c7a8-a61d-44b1-9f7d-ad3a758a0b9a" containerName="extract" Nov 28 13:34:43 crc kubenswrapper[4970]: I1128 13:34:43.888184 4970 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-779fc9694b-7mpcn" Nov 28 13:34:43 crc kubenswrapper[4970]: I1128 13:34:43.901534 4970 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"rabbitmq-cluster-operator-dockercfg-8w8xk" Nov 28 13:34:43 crc kubenswrapper[4970]: I1128 13:34:43.902676 4970 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-779fc9694b-7mpcn"] Nov 28 13:34:43 crc kubenswrapper[4970]: I1128 13:34:43.972991 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k4k4x\" (UniqueName: \"kubernetes.io/projected/4e835339-2c2a-4195-a2b9-d76a7741f412-kube-api-access-k4k4x\") pod \"rabbitmq-cluster-operator-779fc9694b-7mpcn\" (UID: \"4e835339-2c2a-4195-a2b9-d76a7741f412\") " pod="openstack-operators/rabbitmq-cluster-operator-779fc9694b-7mpcn" Nov 28 13:34:44 crc kubenswrapper[4970]: I1128 13:34:44.074581 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k4k4x\" (UniqueName: \"kubernetes.io/projected/4e835339-2c2a-4195-a2b9-d76a7741f412-kube-api-access-k4k4x\") pod \"rabbitmq-cluster-operator-779fc9694b-7mpcn\" (UID: \"4e835339-2c2a-4195-a2b9-d76a7741f412\") " pod="openstack-operators/rabbitmq-cluster-operator-779fc9694b-7mpcn" Nov 28 13:34:44 crc kubenswrapper[4970]: I1128 13:34:44.096310 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k4k4x\" (UniqueName: \"kubernetes.io/projected/4e835339-2c2a-4195-a2b9-d76a7741f412-kube-api-access-k4k4x\") pod \"rabbitmq-cluster-operator-779fc9694b-7mpcn\" (UID: \"4e835339-2c2a-4195-a2b9-d76a7741f412\") " pod="openstack-operators/rabbitmq-cluster-operator-779fc9694b-7mpcn" Nov 28 13:34:44 crc kubenswrapper[4970]: I1128 13:34:44.166938 4970 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="keystone-kuttl-tests/openstack-galera-2" Nov 28 13:34:44 crc kubenswrapper[4970]: I1128 13:34:44.228267 4970 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="keystone-kuttl-tests/openstack-galera-2" Nov 28 13:34:44 crc kubenswrapper[4970]: I1128 13:34:44.260021 4970 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-779fc9694b-7mpcn" Nov 28 13:34:44 crc kubenswrapper[4970]: E1128 13:34:44.510400 4970 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.212:36952->38.102.83.212:42501: write tcp 38.102.83.212:36952->38.102.83.212:42501: write: broken pipe Nov 28 13:34:44 crc kubenswrapper[4970]: I1128 13:34:44.665582 4970 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-779fc9694b-7mpcn"] Nov 28 13:34:45 crc kubenswrapper[4970]: I1128 13:34:45.658289 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-779fc9694b-7mpcn" event={"ID":"4e835339-2c2a-4195-a2b9-d76a7741f412","Type":"ContainerStarted","Data":"3d229bdff3d5e33032adbb2c7d4c6a406d72825d0ecb167f08ed024323122757"} Nov 28 13:34:46 crc kubenswrapper[4970]: E1128 13:34:46.731126 4970 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.212:45696->38.102.83.212:42501: write tcp 38.102.83.212:45696->38.102.83.212:42501: write: connection reset by peer Nov 28 13:34:46 crc kubenswrapper[4970]: E1128 13:34:46.731171 4970 upgradeaware.go:441] Error proxying data from backend to client: writeto tcp 38.102.83.212:45696->38.102.83.212:42501: write tcp 192.168.126.11:10250->192.168.126.11:50062: write: broken pipe Nov 28 13:34:47 crc kubenswrapper[4970]: I1128 13:34:47.061912 4970 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-5gwx8"] Nov 28 13:34:47 crc kubenswrapper[4970]: I1128 13:34:47.063390 4970 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-5gwx8" podUID="3acf03f9-181c-43b8-bc25-24fac1b02311" containerName="registry-server" containerID="cri-o://697391c8038ede24603f304d41c68e188ffc53f7be78152c46cccee74d1f17fa" gracePeriod=2 Nov 28 13:34:47 crc kubenswrapper[4970]: I1128 13:34:47.674544 4970 generic.go:334] "Generic (PLEG): container finished" podID="3acf03f9-181c-43b8-bc25-24fac1b02311" containerID="697391c8038ede24603f304d41c68e188ffc53f7be78152c46cccee74d1f17fa" exitCode=0 Nov 28 13:34:47 crc kubenswrapper[4970]: I1128 13:34:47.674820 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5gwx8" event={"ID":"3acf03f9-181c-43b8-bc25-24fac1b02311","Type":"ContainerDied","Data":"697391c8038ede24603f304d41c68e188ffc53f7be78152c46cccee74d1f17fa"} Nov 28 13:34:49 crc kubenswrapper[4970]: I1128 13:34:49.275206 4970 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-5gwx8" Nov 28 13:34:49 crc kubenswrapper[4970]: I1128 13:34:49.355610 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-shns9\" (UniqueName: \"kubernetes.io/projected/3acf03f9-181c-43b8-bc25-24fac1b02311-kube-api-access-shns9\") pod \"3acf03f9-181c-43b8-bc25-24fac1b02311\" (UID: \"3acf03f9-181c-43b8-bc25-24fac1b02311\") " Nov 28 13:34:49 crc kubenswrapper[4970]: I1128 13:34:49.355675 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3acf03f9-181c-43b8-bc25-24fac1b02311-catalog-content\") pod \"3acf03f9-181c-43b8-bc25-24fac1b02311\" (UID: \"3acf03f9-181c-43b8-bc25-24fac1b02311\") " Nov 28 13:34:49 crc kubenswrapper[4970]: I1128 13:34:49.355764 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3acf03f9-181c-43b8-bc25-24fac1b02311-utilities\") pod \"3acf03f9-181c-43b8-bc25-24fac1b02311\" (UID: \"3acf03f9-181c-43b8-bc25-24fac1b02311\") " Nov 28 13:34:49 crc kubenswrapper[4970]: I1128 13:34:49.359065 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3acf03f9-181c-43b8-bc25-24fac1b02311-utilities" (OuterVolumeSpecName: "utilities") pod "3acf03f9-181c-43b8-bc25-24fac1b02311" (UID: "3acf03f9-181c-43b8-bc25-24fac1b02311"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 28 13:34:49 crc kubenswrapper[4970]: I1128 13:34:49.370427 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3acf03f9-181c-43b8-bc25-24fac1b02311-kube-api-access-shns9" (OuterVolumeSpecName: "kube-api-access-shns9") pod "3acf03f9-181c-43b8-bc25-24fac1b02311" (UID: "3acf03f9-181c-43b8-bc25-24fac1b02311"). InnerVolumeSpecName "kube-api-access-shns9". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 28 13:34:49 crc kubenswrapper[4970]: I1128 13:34:49.435984 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3acf03f9-181c-43b8-bc25-24fac1b02311-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "3acf03f9-181c-43b8-bc25-24fac1b02311" (UID: "3acf03f9-181c-43b8-bc25-24fac1b02311"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 28 13:34:49 crc kubenswrapper[4970]: I1128 13:34:49.457812 4970 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3acf03f9-181c-43b8-bc25-24fac1b02311-utilities\") on node \"crc\" DevicePath \"\"" Nov 28 13:34:49 crc kubenswrapper[4970]: I1128 13:34:49.457854 4970 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-shns9\" (UniqueName: \"kubernetes.io/projected/3acf03f9-181c-43b8-bc25-24fac1b02311-kube-api-access-shns9\") on node \"crc\" DevicePath \"\"" Nov 28 13:34:49 crc kubenswrapper[4970]: I1128 13:34:49.457869 4970 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3acf03f9-181c-43b8-bc25-24fac1b02311-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 28 13:34:49 crc kubenswrapper[4970]: I1128 13:34:49.693047 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5gwx8" event={"ID":"3acf03f9-181c-43b8-bc25-24fac1b02311","Type":"ContainerDied","Data":"ceec4c355a4bdb5c553a705ac65934c00d5c940330a42bff57d279832d7b3f79"} Nov 28 13:34:49 crc kubenswrapper[4970]: I1128 13:34:49.693483 4970 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-5gwx8" Nov 28 13:34:49 crc kubenswrapper[4970]: I1128 13:34:49.693591 4970 scope.go:117] "RemoveContainer" containerID="697391c8038ede24603f304d41c68e188ffc53f7be78152c46cccee74d1f17fa" Nov 28 13:34:49 crc kubenswrapper[4970]: I1128 13:34:49.729352 4970 scope.go:117] "RemoveContainer" containerID="936444902abed013b7375284637490464a7da41689ea144a4c7995e00905363d" Nov 28 13:34:49 crc kubenswrapper[4970]: I1128 13:34:49.729533 4970 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-5gwx8"] Nov 28 13:34:49 crc kubenswrapper[4970]: I1128 13:34:49.744582 4970 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-5gwx8"] Nov 28 13:34:49 crc kubenswrapper[4970]: I1128 13:34:49.769234 4970 scope.go:117] "RemoveContainer" containerID="e30a3669de731c4f4147a25d4fe5e330512dd7f077b43f812439e8edf62dafa3" Nov 28 13:34:50 crc kubenswrapper[4970]: I1128 13:34:50.700337 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-779fc9694b-7mpcn" event={"ID":"4e835339-2c2a-4195-a2b9-d76a7741f412","Type":"ContainerStarted","Data":"733cd32aae0bad2318a592a2d916c7e50fca822264042a0d33f39fa3adcef0fd"} Nov 28 13:34:50 crc kubenswrapper[4970]: I1128 13:34:50.716643 4970 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/rabbitmq-cluster-operator-779fc9694b-7mpcn" podStartSLOduration=2.720960838 podStartE2EDuration="7.716627337s" podCreationTimestamp="2025-11-28 13:34:43 +0000 UTC" firstStartedPulling="2025-11-28 13:34:44.675418353 +0000 UTC m=+895.528300153" lastFinishedPulling="2025-11-28 13:34:49.671084852 +0000 UTC m=+900.523966652" observedRunningTime="2025-11-28 13:34:50.712235005 +0000 UTC m=+901.565116825" watchObservedRunningTime="2025-11-28 13:34:50.716627337 +0000 UTC m=+901.569509137" Nov 28 13:34:51 crc kubenswrapper[4970]: I1128 13:34:51.333695 4970 patch_prober.go:28] interesting pod/machine-config-daemon-tjrng container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 
127.0.0.1:8798: connect: connection refused" start-of-body= Nov 28 13:34:51 crc kubenswrapper[4970]: I1128 13:34:51.333800 4970 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-tjrng" podUID="70bedd43-c527-436e-b47b-0b9ec5b10601" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 28 13:34:51 crc kubenswrapper[4970]: I1128 13:34:51.392259 4970 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3acf03f9-181c-43b8-bc25-24fac1b02311" path="/var/lib/kubelet/pods/3acf03f9-181c-43b8-bc25-24fac1b02311/volumes" Nov 28 13:34:54 crc kubenswrapper[4970]: I1128 13:34:54.079064 4970 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["keystone-kuttl-tests/rabbitmq-server-0"] Nov 28 13:34:54 crc kubenswrapper[4970]: E1128 13:34:54.080032 4970 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3acf03f9-181c-43b8-bc25-24fac1b02311" containerName="registry-server" Nov 28 13:34:54 crc kubenswrapper[4970]: I1128 13:34:54.080058 4970 state_mem.go:107] "Deleted CPUSet assignment" podUID="3acf03f9-181c-43b8-bc25-24fac1b02311" containerName="registry-server" Nov 28 13:34:54 crc kubenswrapper[4970]: E1128 13:34:54.080110 4970 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3acf03f9-181c-43b8-bc25-24fac1b02311" containerName="extract-content" Nov 28 13:34:54 crc kubenswrapper[4970]: I1128 13:34:54.080124 4970 state_mem.go:107] "Deleted CPUSet assignment" podUID="3acf03f9-181c-43b8-bc25-24fac1b02311" containerName="extract-content" Nov 28 13:34:54 crc kubenswrapper[4970]: E1128 13:34:54.080158 4970 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3acf03f9-181c-43b8-bc25-24fac1b02311" containerName="extract-utilities" Nov 28 13:34:54 crc kubenswrapper[4970]: I1128 13:34:54.080174 4970 state_mem.go:107] "Deleted CPUSet assignment" podUID="3acf03f9-181c-43b8-bc25-24fac1b02311" containerName="extract-utilities" Nov 28 13:34:54 crc kubenswrapper[4970]: I1128 13:34:54.088086 4970 memory_manager.go:354] "RemoveStaleState removing state" podUID="3acf03f9-181c-43b8-bc25-24fac1b02311" containerName="registry-server" Nov 28 13:34:54 crc kubenswrapper[4970]: I1128 13:34:54.090286 4970 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="keystone-kuttl-tests/rabbitmq-server-0" Nov 28 13:34:54 crc kubenswrapper[4970]: I1128 13:34:54.093303 4970 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"rabbitmq-erlang-cookie" Nov 28 13:34:54 crc kubenswrapper[4970]: I1128 13:34:54.093835 4970 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"rabbitmq-server-dockercfg-6nl7l" Nov 28 13:34:54 crc kubenswrapper[4970]: I1128 13:34:54.094030 4970 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"rabbitmq-default-user" Nov 28 13:34:54 crc kubenswrapper[4970]: I1128 13:34:54.094108 4970 reflector.go:368] Caches populated for *v1.ConfigMap from object-"keystone-kuttl-tests"/"rabbitmq-plugins-conf" Nov 28 13:34:54 crc kubenswrapper[4970]: I1128 13:34:54.094197 4970 reflector.go:368] Caches populated for *v1.ConfigMap from object-"keystone-kuttl-tests"/"rabbitmq-server-conf" Nov 28 13:34:54 crc kubenswrapper[4970]: I1128 13:34:54.107347 4970 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/rabbitmq-server-0"] Nov 28 13:34:54 crc kubenswrapper[4970]: I1128 13:34:54.122273 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2p4d4\" (UniqueName: \"kubernetes.io/projected/dfa2f2ae-c626-4fd8-a04c-a762e271a467-kube-api-access-2p4d4\") pod \"rabbitmq-server-0\" (UID: \"dfa2f2ae-c626-4fd8-a04c-a762e271a467\") " pod="keystone-kuttl-tests/rabbitmq-server-0" Nov 28 13:34:54 crc kubenswrapper[4970]: I1128 13:34:54.122347 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/dfa2f2ae-c626-4fd8-a04c-a762e271a467-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"dfa2f2ae-c626-4fd8-a04c-a762e271a467\") " pod="keystone-kuttl-tests/rabbitmq-server-0" Nov 28 13:34:54 crc kubenswrapper[4970]: I1128 13:34:54.122390 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/dfa2f2ae-c626-4fd8-a04c-a762e271a467-pod-info\") pod \"rabbitmq-server-0\" (UID: \"dfa2f2ae-c626-4fd8-a04c-a762e271a467\") " pod="keystone-kuttl-tests/rabbitmq-server-0" Nov 28 13:34:54 crc kubenswrapper[4970]: I1128 13:34:54.122433 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/dfa2f2ae-c626-4fd8-a04c-a762e271a467-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"dfa2f2ae-c626-4fd8-a04c-a762e271a467\") " pod="keystone-kuttl-tests/rabbitmq-server-0" Nov 28 13:34:54 crc kubenswrapper[4970]: I1128 13:34:54.122458 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-170f589f-095f-44f6-a9f9-fad686c8f582\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-170f589f-095f-44f6-a9f9-fad686c8f582\") pod \"rabbitmq-server-0\" (UID: \"dfa2f2ae-c626-4fd8-a04c-a762e271a467\") " pod="keystone-kuttl-tests/rabbitmq-server-0" Nov 28 13:34:54 crc kubenswrapper[4970]: I1128 13:34:54.122490 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/dfa2f2ae-c626-4fd8-a04c-a762e271a467-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"dfa2f2ae-c626-4fd8-a04c-a762e271a467\") " 
pod="keystone-kuttl-tests/rabbitmq-server-0" Nov 28 13:34:54 crc kubenswrapper[4970]: I1128 13:34:54.122518 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/dfa2f2ae-c626-4fd8-a04c-a762e271a467-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"dfa2f2ae-c626-4fd8-a04c-a762e271a467\") " pod="keystone-kuttl-tests/rabbitmq-server-0" Nov 28 13:34:54 crc kubenswrapper[4970]: I1128 13:34:54.122557 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/dfa2f2ae-c626-4fd8-a04c-a762e271a467-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"dfa2f2ae-c626-4fd8-a04c-a762e271a467\") " pod="keystone-kuttl-tests/rabbitmq-server-0" Nov 28 13:34:54 crc kubenswrapper[4970]: I1128 13:34:54.223642 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/dfa2f2ae-c626-4fd8-a04c-a762e271a467-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"dfa2f2ae-c626-4fd8-a04c-a762e271a467\") " pod="keystone-kuttl-tests/rabbitmq-server-0" Nov 28 13:34:54 crc kubenswrapper[4970]: I1128 13:34:54.223708 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/dfa2f2ae-c626-4fd8-a04c-a762e271a467-pod-info\") pod \"rabbitmq-server-0\" (UID: \"dfa2f2ae-c626-4fd8-a04c-a762e271a467\") " pod="keystone-kuttl-tests/rabbitmq-server-0" Nov 28 13:34:54 crc kubenswrapper[4970]: I1128 13:34:54.223762 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/dfa2f2ae-c626-4fd8-a04c-a762e271a467-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"dfa2f2ae-c626-4fd8-a04c-a762e271a467\") " pod="keystone-kuttl-tests/rabbitmq-server-0" Nov 28 13:34:54 crc kubenswrapper[4970]: I1128 13:34:54.223791 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-170f589f-095f-44f6-a9f9-fad686c8f582\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-170f589f-095f-44f6-a9f9-fad686c8f582\") pod \"rabbitmq-server-0\" (UID: \"dfa2f2ae-c626-4fd8-a04c-a762e271a467\") " pod="keystone-kuttl-tests/rabbitmq-server-0" Nov 28 13:34:54 crc kubenswrapper[4970]: I1128 13:34:54.223830 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/dfa2f2ae-c626-4fd8-a04c-a762e271a467-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"dfa2f2ae-c626-4fd8-a04c-a762e271a467\") " pod="keystone-kuttl-tests/rabbitmq-server-0" Nov 28 13:34:54 crc kubenswrapper[4970]: I1128 13:34:54.223879 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/dfa2f2ae-c626-4fd8-a04c-a762e271a467-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"dfa2f2ae-c626-4fd8-a04c-a762e271a467\") " pod="keystone-kuttl-tests/rabbitmq-server-0" Nov 28 13:34:54 crc kubenswrapper[4970]: I1128 13:34:54.223926 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/dfa2f2ae-c626-4fd8-a04c-a762e271a467-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"dfa2f2ae-c626-4fd8-a04c-a762e271a467\") " 
pod="keystone-kuttl-tests/rabbitmq-server-0" Nov 28 13:34:54 crc kubenswrapper[4970]: I1128 13:34:54.223952 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2p4d4\" (UniqueName: \"kubernetes.io/projected/dfa2f2ae-c626-4fd8-a04c-a762e271a467-kube-api-access-2p4d4\") pod \"rabbitmq-server-0\" (UID: \"dfa2f2ae-c626-4fd8-a04c-a762e271a467\") " pod="keystone-kuttl-tests/rabbitmq-server-0" Nov 28 13:34:54 crc kubenswrapper[4970]: I1128 13:34:54.224875 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/dfa2f2ae-c626-4fd8-a04c-a762e271a467-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"dfa2f2ae-c626-4fd8-a04c-a762e271a467\") " pod="keystone-kuttl-tests/rabbitmq-server-0" Nov 28 13:34:54 crc kubenswrapper[4970]: I1128 13:34:54.224955 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/dfa2f2ae-c626-4fd8-a04c-a762e271a467-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"dfa2f2ae-c626-4fd8-a04c-a762e271a467\") " pod="keystone-kuttl-tests/rabbitmq-server-0" Nov 28 13:34:54 crc kubenswrapper[4970]: I1128 13:34:54.225450 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/dfa2f2ae-c626-4fd8-a04c-a762e271a467-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"dfa2f2ae-c626-4fd8-a04c-a762e271a467\") " pod="keystone-kuttl-tests/rabbitmq-server-0" Nov 28 13:34:54 crc kubenswrapper[4970]: I1128 13:34:54.230808 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/dfa2f2ae-c626-4fd8-a04c-a762e271a467-pod-info\") pod \"rabbitmq-server-0\" (UID: \"dfa2f2ae-c626-4fd8-a04c-a762e271a467\") " pod="keystone-kuttl-tests/rabbitmq-server-0" Nov 28 13:34:54 crc kubenswrapper[4970]: I1128 13:34:54.230839 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/dfa2f2ae-c626-4fd8-a04c-a762e271a467-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"dfa2f2ae-c626-4fd8-a04c-a762e271a467\") " pod="keystone-kuttl-tests/rabbitmq-server-0" Nov 28 13:34:54 crc kubenswrapper[4970]: I1128 13:34:54.231202 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/dfa2f2ae-c626-4fd8-a04c-a762e271a467-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"dfa2f2ae-c626-4fd8-a04c-a762e271a467\") " pod="keystone-kuttl-tests/rabbitmq-server-0" Nov 28 13:34:54 crc kubenswrapper[4970]: I1128 13:34:54.231334 4970 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Nov 28 13:34:54 crc kubenswrapper[4970]: I1128 13:34:54.231369 4970 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-170f589f-095f-44f6-a9f9-fad686c8f582\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-170f589f-095f-44f6-a9f9-fad686c8f582\") pod \"rabbitmq-server-0\" (UID: \"dfa2f2ae-c626-4fd8-a04c-a762e271a467\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/52e8b6086657b6813a6e9af6c31c9a1a4d9da024cf4db0ee200a8bb0541eea9c/globalmount\"" pod="keystone-kuttl-tests/rabbitmq-server-0" Nov 28 13:34:54 crc kubenswrapper[4970]: I1128 13:34:54.247917 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2p4d4\" (UniqueName: \"kubernetes.io/projected/dfa2f2ae-c626-4fd8-a04c-a762e271a467-kube-api-access-2p4d4\") pod \"rabbitmq-server-0\" (UID: \"dfa2f2ae-c626-4fd8-a04c-a762e271a467\") " pod="keystone-kuttl-tests/rabbitmq-server-0" Nov 28 13:34:54 crc kubenswrapper[4970]: I1128 13:34:54.251467 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-170f589f-095f-44f6-a9f9-fad686c8f582\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-170f589f-095f-44f6-a9f9-fad686c8f582\") pod \"rabbitmq-server-0\" (UID: \"dfa2f2ae-c626-4fd8-a04c-a762e271a467\") " pod="keystone-kuttl-tests/rabbitmq-server-0" Nov 28 13:34:54 crc kubenswrapper[4970]: I1128 13:34:54.446902 4970 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/rabbitmq-server-0" Nov 28 13:34:54 crc kubenswrapper[4970]: I1128 13:34:54.964762 4970 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/rabbitmq-server-0"] Nov 28 13:34:55 crc kubenswrapper[4970]: I1128 13:34:55.733558 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/rabbitmq-server-0" event={"ID":"dfa2f2ae-c626-4fd8-a04c-a762e271a467","Type":"ContainerStarted","Data":"0ed3cb1adb1799e5db85e4864dc08df31995ccf37f8a80a413836ad153021cc4"} Nov 28 13:34:56 crc kubenswrapper[4970]: I1128 13:34:56.676384 4970 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/keystone-operator-index-d6xcn"] Nov 28 13:34:56 crc kubenswrapper[4970]: I1128 13:34:56.677611 4970 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/keystone-operator-index-d6xcn" Nov 28 13:34:56 crc kubenswrapper[4970]: I1128 13:34:56.681026 4970 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"keystone-operator-index-dockercfg-tm6v7" Nov 28 13:34:56 crc kubenswrapper[4970]: I1128 13:34:56.700743 4970 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-index-d6xcn"] Nov 28 13:34:56 crc kubenswrapper[4970]: I1128 13:34:56.763952 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bvl6w\" (UniqueName: \"kubernetes.io/projected/a65f142b-ad4e-4901-9ca1-8d27e66fc59c-kube-api-access-bvl6w\") pod \"keystone-operator-index-d6xcn\" (UID: \"a65f142b-ad4e-4901-9ca1-8d27e66fc59c\") " pod="openstack-operators/keystone-operator-index-d6xcn" Nov 28 13:34:56 crc kubenswrapper[4970]: I1128 13:34:56.865130 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bvl6w\" (UniqueName: \"kubernetes.io/projected/a65f142b-ad4e-4901-9ca1-8d27e66fc59c-kube-api-access-bvl6w\") pod \"keystone-operator-index-d6xcn\" (UID: \"a65f142b-ad4e-4901-9ca1-8d27e66fc59c\") " pod="openstack-operators/keystone-operator-index-d6xcn" Nov 28 13:34:56 crc kubenswrapper[4970]: I1128 13:34:56.883552 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bvl6w\" (UniqueName: \"kubernetes.io/projected/a65f142b-ad4e-4901-9ca1-8d27e66fc59c-kube-api-access-bvl6w\") pod \"keystone-operator-index-d6xcn\" (UID: \"a65f142b-ad4e-4901-9ca1-8d27e66fc59c\") " pod="openstack-operators/keystone-operator-index-d6xcn" Nov 28 13:34:56 crc kubenswrapper[4970]: I1128 13:34:56.995934 4970 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/keystone-operator-index-d6xcn" Nov 28 13:34:58 crc kubenswrapper[4970]: I1128 13:34:58.152344 4970 prober.go:107] "Probe failed" probeType="Readiness" pod="keystone-kuttl-tests/openstack-galera-2" podUID="70137649-04fe-46dd-94ef-03a6ab19aecd" containerName="galera" probeResult="failure" output=< Nov 28 13:34:58 crc kubenswrapper[4970]: wsrep_local_state_comment (Donor/Desynced) differs from Synced Nov 28 13:34:58 crc kubenswrapper[4970]: > Nov 28 13:35:01 crc kubenswrapper[4970]: I1128 13:35:01.329794 4970 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="keystone-kuttl-tests/openstack-galera-0" Nov 28 13:35:01 crc kubenswrapper[4970]: I1128 13:35:01.415601 4970 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="keystone-kuttl-tests/openstack-galera-0" Nov 28 13:35:05 crc kubenswrapper[4970]: I1128 13:35:05.598600 4970 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-index-d6xcn"] Nov 28 13:35:06 crc kubenswrapper[4970]: I1128 13:35:06.818117 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-index-d6xcn" event={"ID":"a65f142b-ad4e-4901-9ca1-8d27e66fc59c","Type":"ContainerStarted","Data":"8475e5ee79ea457d3c68aa18543a573a86df620497bc9434436e2abdb5aa84d3"} Nov 28 13:35:06 crc kubenswrapper[4970]: I1128 13:35:06.894144 4970 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="keystone-kuttl-tests/openstack-galera-1" Nov 28 13:35:07 crc kubenswrapper[4970]: I1128 13:35:07.014854 4970 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="keystone-kuttl-tests/openstack-galera-1" Nov 28 13:35:07 crc kubenswrapper[4970]: I1128 13:35:07.825332 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/rabbitmq-server-0" event={"ID":"dfa2f2ae-c626-4fd8-a04c-a762e271a467","Type":"ContainerStarted","Data":"49d7a49f1b55c97d9b5689058fdb75d79f956f69e116755a84ab943c4e4da4ed"} Nov 28 13:35:09 crc kubenswrapper[4970]: I1128 13:35:09.856241 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-index-d6xcn" event={"ID":"a65f142b-ad4e-4901-9ca1-8d27e66fc59c","Type":"ContainerStarted","Data":"3033b684932cb40bcd30965974670650b1d5091eea5dfee3c188fa9c5e33baf2"} Nov 28 13:35:09 crc kubenswrapper[4970]: I1128 13:35:09.881096 4970 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/keystone-operator-index-d6xcn" podStartSLOduration=11.140074538 podStartE2EDuration="13.881065952s" podCreationTimestamp="2025-11-28 13:34:56 +0000 UTC" firstStartedPulling="2025-11-28 13:35:05.939861788 +0000 UTC m=+916.792743588" lastFinishedPulling="2025-11-28 13:35:08.680853202 +0000 UTC m=+919.533735002" observedRunningTime="2025-11-28 13:35:09.874518618 +0000 UTC m=+920.727400458" watchObservedRunningTime="2025-11-28 13:35:09.881065952 +0000 UTC m=+920.733947792" Nov 28 13:35:16 crc kubenswrapper[4970]: I1128 13:35:16.996862 4970 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/keystone-operator-index-d6xcn" Nov 28 13:35:16 crc kubenswrapper[4970]: I1128 13:35:16.997398 4970 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/keystone-operator-index-d6xcn" Nov 28 13:35:17 crc kubenswrapper[4970]: I1128 13:35:17.028306 4970 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openstack-operators/keystone-operator-index-d6xcn" Nov 28 13:35:17 crc kubenswrapper[4970]: I1128 13:35:17.936490 4970 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/keystone-operator-index-d6xcn" Nov 28 13:35:21 crc kubenswrapper[4970]: I1128 13:35:21.333869 4970 patch_prober.go:28] interesting pod/machine-config-daemon-tjrng container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 28 13:35:21 crc kubenswrapper[4970]: I1128 13:35:21.334269 4970 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-tjrng" podUID="70bedd43-c527-436e-b47b-0b9ec5b10601" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 28 13:35:26 crc kubenswrapper[4970]: I1128 13:35:26.726838 4970 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/0f0d088e5b30b817f0b379848316eaf0cb8438002c589713c8ae2315c35bggs"] Nov 28 13:35:26 crc kubenswrapper[4970]: I1128 13:35:26.728957 4970 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/0f0d088e5b30b817f0b379848316eaf0cb8438002c589713c8ae2315c35bggs" Nov 28 13:35:26 crc kubenswrapper[4970]: I1128 13:35:26.731787 4970 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-77hkb" Nov 28 13:35:26 crc kubenswrapper[4970]: I1128 13:35:26.745778 4970 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/0f0d088e5b30b817f0b379848316eaf0cb8438002c589713c8ae2315c35bggs"] Nov 28 13:35:26 crc kubenswrapper[4970]: I1128 13:35:26.833393 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jjvq6\" (UniqueName: \"kubernetes.io/projected/cbe3a2fc-2688-413f-b4e1-9ba678488f30-kube-api-access-jjvq6\") pod \"0f0d088e5b30b817f0b379848316eaf0cb8438002c589713c8ae2315c35bggs\" (UID: \"cbe3a2fc-2688-413f-b4e1-9ba678488f30\") " pod="openstack-operators/0f0d088e5b30b817f0b379848316eaf0cb8438002c589713c8ae2315c35bggs" Nov 28 13:35:26 crc kubenswrapper[4970]: I1128 13:35:26.833458 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/cbe3a2fc-2688-413f-b4e1-9ba678488f30-util\") pod \"0f0d088e5b30b817f0b379848316eaf0cb8438002c589713c8ae2315c35bggs\" (UID: \"cbe3a2fc-2688-413f-b4e1-9ba678488f30\") " pod="openstack-operators/0f0d088e5b30b817f0b379848316eaf0cb8438002c589713c8ae2315c35bggs" Nov 28 13:35:26 crc kubenswrapper[4970]: I1128 13:35:26.833556 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/cbe3a2fc-2688-413f-b4e1-9ba678488f30-bundle\") pod \"0f0d088e5b30b817f0b379848316eaf0cb8438002c589713c8ae2315c35bggs\" (UID: \"cbe3a2fc-2688-413f-b4e1-9ba678488f30\") " pod="openstack-operators/0f0d088e5b30b817f0b379848316eaf0cb8438002c589713c8ae2315c35bggs" Nov 28 13:35:26 crc kubenswrapper[4970]: I1128 13:35:26.935284 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/cbe3a2fc-2688-413f-b4e1-9ba678488f30-bundle\") pod 
\"0f0d088e5b30b817f0b379848316eaf0cb8438002c589713c8ae2315c35bggs\" (UID: \"cbe3a2fc-2688-413f-b4e1-9ba678488f30\") " pod="openstack-operators/0f0d088e5b30b817f0b379848316eaf0cb8438002c589713c8ae2315c35bggs" Nov 28 13:35:26 crc kubenswrapper[4970]: I1128 13:35:26.935517 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jjvq6\" (UniqueName: \"kubernetes.io/projected/cbe3a2fc-2688-413f-b4e1-9ba678488f30-kube-api-access-jjvq6\") pod \"0f0d088e5b30b817f0b379848316eaf0cb8438002c589713c8ae2315c35bggs\" (UID: \"cbe3a2fc-2688-413f-b4e1-9ba678488f30\") " pod="openstack-operators/0f0d088e5b30b817f0b379848316eaf0cb8438002c589713c8ae2315c35bggs" Nov 28 13:35:26 crc kubenswrapper[4970]: I1128 13:35:26.935588 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/cbe3a2fc-2688-413f-b4e1-9ba678488f30-util\") pod \"0f0d088e5b30b817f0b379848316eaf0cb8438002c589713c8ae2315c35bggs\" (UID: \"cbe3a2fc-2688-413f-b4e1-9ba678488f30\") " pod="openstack-operators/0f0d088e5b30b817f0b379848316eaf0cb8438002c589713c8ae2315c35bggs" Nov 28 13:35:26 crc kubenswrapper[4970]: I1128 13:35:26.936321 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/cbe3a2fc-2688-413f-b4e1-9ba678488f30-bundle\") pod \"0f0d088e5b30b817f0b379848316eaf0cb8438002c589713c8ae2315c35bggs\" (UID: \"cbe3a2fc-2688-413f-b4e1-9ba678488f30\") " pod="openstack-operators/0f0d088e5b30b817f0b379848316eaf0cb8438002c589713c8ae2315c35bggs" Nov 28 13:35:26 crc kubenswrapper[4970]: I1128 13:35:26.936366 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/cbe3a2fc-2688-413f-b4e1-9ba678488f30-util\") pod \"0f0d088e5b30b817f0b379848316eaf0cb8438002c589713c8ae2315c35bggs\" (UID: \"cbe3a2fc-2688-413f-b4e1-9ba678488f30\") " pod="openstack-operators/0f0d088e5b30b817f0b379848316eaf0cb8438002c589713c8ae2315c35bggs" Nov 28 13:35:26 crc kubenswrapper[4970]: I1128 13:35:26.962014 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jjvq6\" (UniqueName: \"kubernetes.io/projected/cbe3a2fc-2688-413f-b4e1-9ba678488f30-kube-api-access-jjvq6\") pod \"0f0d088e5b30b817f0b379848316eaf0cb8438002c589713c8ae2315c35bggs\" (UID: \"cbe3a2fc-2688-413f-b4e1-9ba678488f30\") " pod="openstack-operators/0f0d088e5b30b817f0b379848316eaf0cb8438002c589713c8ae2315c35bggs" Nov 28 13:35:27 crc kubenswrapper[4970]: I1128 13:35:27.052147 4970 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/0f0d088e5b30b817f0b379848316eaf0cb8438002c589713c8ae2315c35bggs" Nov 28 13:35:27 crc kubenswrapper[4970]: I1128 13:35:27.297032 4970 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/0f0d088e5b30b817f0b379848316eaf0cb8438002c589713c8ae2315c35bggs"] Nov 28 13:35:27 crc kubenswrapper[4970]: I1128 13:35:27.988465 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/0f0d088e5b30b817f0b379848316eaf0cb8438002c589713c8ae2315c35bggs" event={"ID":"cbe3a2fc-2688-413f-b4e1-9ba678488f30","Type":"ContainerStarted","Data":"e41808980439353a60e2adce629dafd34aee25459453c160341c444ba232eb1e"} Nov 28 13:35:28 crc kubenswrapper[4970]: I1128 13:35:28.998791 4970 generic.go:334] "Generic (PLEG): container finished" podID="cbe3a2fc-2688-413f-b4e1-9ba678488f30" containerID="150e51d52ee773b3c719f948fff10efa3b4ad3ac77b33ee8ae2891e1053d29df" exitCode=0 Nov 28 13:35:28 crc kubenswrapper[4970]: I1128 13:35:28.998903 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/0f0d088e5b30b817f0b379848316eaf0cb8438002c589713c8ae2315c35bggs" event={"ID":"cbe3a2fc-2688-413f-b4e1-9ba678488f30","Type":"ContainerDied","Data":"150e51d52ee773b3c719f948fff10efa3b4ad3ac77b33ee8ae2891e1053d29df"} Nov 28 13:35:30 crc kubenswrapper[4970]: I1128 13:35:30.010634 4970 generic.go:334] "Generic (PLEG): container finished" podID="cbe3a2fc-2688-413f-b4e1-9ba678488f30" containerID="cfd83d7153ecbe17ab6dbd52dc5de1437c77a0f3d12515724e782589a27622d1" exitCode=0 Nov 28 13:35:30 crc kubenswrapper[4970]: I1128 13:35:30.010712 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/0f0d088e5b30b817f0b379848316eaf0cb8438002c589713c8ae2315c35bggs" event={"ID":"cbe3a2fc-2688-413f-b4e1-9ba678488f30","Type":"ContainerDied","Data":"cfd83d7153ecbe17ab6dbd52dc5de1437c77a0f3d12515724e782589a27622d1"} Nov 28 13:35:31 crc kubenswrapper[4970]: I1128 13:35:31.021602 4970 generic.go:334] "Generic (PLEG): container finished" podID="cbe3a2fc-2688-413f-b4e1-9ba678488f30" containerID="4f5016f71be40c8fa0412d2d3623256df8cd9a5426d0c0b6ba7b92eac73c3a8a" exitCode=0 Nov 28 13:35:31 crc kubenswrapper[4970]: I1128 13:35:31.021693 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/0f0d088e5b30b817f0b379848316eaf0cb8438002c589713c8ae2315c35bggs" event={"ID":"cbe3a2fc-2688-413f-b4e1-9ba678488f30","Type":"ContainerDied","Data":"4f5016f71be40c8fa0412d2d3623256df8cd9a5426d0c0b6ba7b92eac73c3a8a"} Nov 28 13:35:32 crc kubenswrapper[4970]: I1128 13:35:32.299397 4970 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/0f0d088e5b30b817f0b379848316eaf0cb8438002c589713c8ae2315c35bggs" Nov 28 13:35:32 crc kubenswrapper[4970]: I1128 13:35:32.314656 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jjvq6\" (UniqueName: \"kubernetes.io/projected/cbe3a2fc-2688-413f-b4e1-9ba678488f30-kube-api-access-jjvq6\") pod \"cbe3a2fc-2688-413f-b4e1-9ba678488f30\" (UID: \"cbe3a2fc-2688-413f-b4e1-9ba678488f30\") " Nov 28 13:35:32 crc kubenswrapper[4970]: I1128 13:35:32.314769 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/cbe3a2fc-2688-413f-b4e1-9ba678488f30-util\") pod \"cbe3a2fc-2688-413f-b4e1-9ba678488f30\" (UID: \"cbe3a2fc-2688-413f-b4e1-9ba678488f30\") " Nov 28 13:35:32 crc kubenswrapper[4970]: I1128 13:35:32.314804 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/cbe3a2fc-2688-413f-b4e1-9ba678488f30-bundle\") pod \"cbe3a2fc-2688-413f-b4e1-9ba678488f30\" (UID: \"cbe3a2fc-2688-413f-b4e1-9ba678488f30\") " Nov 28 13:35:32 crc kubenswrapper[4970]: I1128 13:35:32.318198 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cbe3a2fc-2688-413f-b4e1-9ba678488f30-bundle" (OuterVolumeSpecName: "bundle") pod "cbe3a2fc-2688-413f-b4e1-9ba678488f30" (UID: "cbe3a2fc-2688-413f-b4e1-9ba678488f30"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 28 13:35:32 crc kubenswrapper[4970]: I1128 13:35:32.326399 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cbe3a2fc-2688-413f-b4e1-9ba678488f30-kube-api-access-jjvq6" (OuterVolumeSpecName: "kube-api-access-jjvq6") pod "cbe3a2fc-2688-413f-b4e1-9ba678488f30" (UID: "cbe3a2fc-2688-413f-b4e1-9ba678488f30"). InnerVolumeSpecName "kube-api-access-jjvq6". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 28 13:35:32 crc kubenswrapper[4970]: I1128 13:35:32.342112 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cbe3a2fc-2688-413f-b4e1-9ba678488f30-util" (OuterVolumeSpecName: "util") pod "cbe3a2fc-2688-413f-b4e1-9ba678488f30" (UID: "cbe3a2fc-2688-413f-b4e1-9ba678488f30"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 28 13:35:32 crc kubenswrapper[4970]: I1128 13:35:32.416377 4970 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jjvq6\" (UniqueName: \"kubernetes.io/projected/cbe3a2fc-2688-413f-b4e1-9ba678488f30-kube-api-access-jjvq6\") on node \"crc\" DevicePath \"\"" Nov 28 13:35:32 crc kubenswrapper[4970]: I1128 13:35:32.416526 4970 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/cbe3a2fc-2688-413f-b4e1-9ba678488f30-util\") on node \"crc\" DevicePath \"\"" Nov 28 13:35:32 crc kubenswrapper[4970]: I1128 13:35:32.416604 4970 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/cbe3a2fc-2688-413f-b4e1-9ba678488f30-bundle\") on node \"crc\" DevicePath \"\"" Nov 28 13:35:33 crc kubenswrapper[4970]: I1128 13:35:33.039123 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/0f0d088e5b30b817f0b379848316eaf0cb8438002c589713c8ae2315c35bggs" event={"ID":"cbe3a2fc-2688-413f-b4e1-9ba678488f30","Type":"ContainerDied","Data":"e41808980439353a60e2adce629dafd34aee25459453c160341c444ba232eb1e"} Nov 28 13:35:33 crc kubenswrapper[4970]: I1128 13:35:33.039185 4970 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e41808980439353a60e2adce629dafd34aee25459453c160341c444ba232eb1e" Nov 28 13:35:33 crc kubenswrapper[4970]: I1128 13:35:33.039317 4970 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/0f0d088e5b30b817f0b379848316eaf0cb8438002c589713c8ae2315c35bggs" Nov 28 13:35:41 crc kubenswrapper[4970]: I1128 13:35:41.109642 4970 generic.go:334] "Generic (PLEG): container finished" podID="dfa2f2ae-c626-4fd8-a04c-a762e271a467" containerID="49d7a49f1b55c97d9b5689058fdb75d79f956f69e116755a84ab943c4e4da4ed" exitCode=0 Nov 28 13:35:41 crc kubenswrapper[4970]: I1128 13:35:41.109718 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/rabbitmq-server-0" event={"ID":"dfa2f2ae-c626-4fd8-a04c-a762e271a467","Type":"ContainerDied","Data":"49d7a49f1b55c97d9b5689058fdb75d79f956f69e116755a84ab943c4e4da4ed"} Nov 28 13:35:41 crc kubenswrapper[4970]: I1128 13:35:41.559284 4970 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/keystone-operator-controller-manager-7456869864-hwf9l"] Nov 28 13:35:41 crc kubenswrapper[4970]: E1128 13:35:41.559909 4970 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cbe3a2fc-2688-413f-b4e1-9ba678488f30" containerName="util" Nov 28 13:35:41 crc kubenswrapper[4970]: I1128 13:35:41.559930 4970 state_mem.go:107] "Deleted CPUSet assignment" podUID="cbe3a2fc-2688-413f-b4e1-9ba678488f30" containerName="util" Nov 28 13:35:41 crc kubenswrapper[4970]: E1128 13:35:41.559942 4970 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cbe3a2fc-2688-413f-b4e1-9ba678488f30" containerName="extract" Nov 28 13:35:41 crc kubenswrapper[4970]: I1128 13:35:41.559951 4970 state_mem.go:107] "Deleted CPUSet assignment" podUID="cbe3a2fc-2688-413f-b4e1-9ba678488f30" containerName="extract" Nov 28 13:35:41 crc kubenswrapper[4970]: E1128 13:35:41.559971 4970 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cbe3a2fc-2688-413f-b4e1-9ba678488f30" containerName="pull" Nov 28 13:35:41 crc kubenswrapper[4970]: I1128 13:35:41.559980 4970 state_mem.go:107] "Deleted CPUSet assignment" podUID="cbe3a2fc-2688-413f-b4e1-9ba678488f30" containerName="pull" Nov 28 
13:35:41 crc kubenswrapper[4970]: I1128 13:35:41.560144 4970 memory_manager.go:354] "RemoveStaleState removing state" podUID="cbe3a2fc-2688-413f-b4e1-9ba678488f30" containerName="extract" Nov 28 13:35:41 crc kubenswrapper[4970]: I1128 13:35:41.560716 4970 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-7456869864-hwf9l" Nov 28 13:35:41 crc kubenswrapper[4970]: I1128 13:35:41.564762 4970 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"keystone-operator-controller-manager-dockercfg-fkv22" Nov 28 13:35:41 crc kubenswrapper[4970]: I1128 13:35:41.567347 4970 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"keystone-operator-controller-manager-service-cert" Nov 28 13:35:41 crc kubenswrapper[4970]: I1128 13:35:41.568712 4970 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-7456869864-hwf9l"] Nov 28 13:35:41 crc kubenswrapper[4970]: I1128 13:35:41.645378 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b7b7k\" (UniqueName: \"kubernetes.io/projected/8bcf5bd0-9824-4c40-b009-8f2e50ad08b0-kube-api-access-b7b7k\") pod \"keystone-operator-controller-manager-7456869864-hwf9l\" (UID: \"8bcf5bd0-9824-4c40-b009-8f2e50ad08b0\") " pod="openstack-operators/keystone-operator-controller-manager-7456869864-hwf9l" Nov 28 13:35:41 crc kubenswrapper[4970]: I1128 13:35:41.645455 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/8bcf5bd0-9824-4c40-b009-8f2e50ad08b0-webhook-cert\") pod \"keystone-operator-controller-manager-7456869864-hwf9l\" (UID: \"8bcf5bd0-9824-4c40-b009-8f2e50ad08b0\") " pod="openstack-operators/keystone-operator-controller-manager-7456869864-hwf9l" Nov 28 13:35:41 crc kubenswrapper[4970]: I1128 13:35:41.645528 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/8bcf5bd0-9824-4c40-b009-8f2e50ad08b0-apiservice-cert\") pod \"keystone-operator-controller-manager-7456869864-hwf9l\" (UID: \"8bcf5bd0-9824-4c40-b009-8f2e50ad08b0\") " pod="openstack-operators/keystone-operator-controller-manager-7456869864-hwf9l" Nov 28 13:35:41 crc kubenswrapper[4970]: I1128 13:35:41.747113 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/8bcf5bd0-9824-4c40-b009-8f2e50ad08b0-apiservice-cert\") pod \"keystone-operator-controller-manager-7456869864-hwf9l\" (UID: \"8bcf5bd0-9824-4c40-b009-8f2e50ad08b0\") " pod="openstack-operators/keystone-operator-controller-manager-7456869864-hwf9l" Nov 28 13:35:41 crc kubenswrapper[4970]: I1128 13:35:41.747254 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b7b7k\" (UniqueName: \"kubernetes.io/projected/8bcf5bd0-9824-4c40-b009-8f2e50ad08b0-kube-api-access-b7b7k\") pod \"keystone-operator-controller-manager-7456869864-hwf9l\" (UID: \"8bcf5bd0-9824-4c40-b009-8f2e50ad08b0\") " pod="openstack-operators/keystone-operator-controller-manager-7456869864-hwf9l" Nov 28 13:35:41 crc kubenswrapper[4970]: I1128 13:35:41.747297 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: 
\"kubernetes.io/secret/8bcf5bd0-9824-4c40-b009-8f2e50ad08b0-webhook-cert\") pod \"keystone-operator-controller-manager-7456869864-hwf9l\" (UID: \"8bcf5bd0-9824-4c40-b009-8f2e50ad08b0\") " pod="openstack-operators/keystone-operator-controller-manager-7456869864-hwf9l" Nov 28 13:35:41 crc kubenswrapper[4970]: I1128 13:35:41.756071 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/8bcf5bd0-9824-4c40-b009-8f2e50ad08b0-webhook-cert\") pod \"keystone-operator-controller-manager-7456869864-hwf9l\" (UID: \"8bcf5bd0-9824-4c40-b009-8f2e50ad08b0\") " pod="openstack-operators/keystone-operator-controller-manager-7456869864-hwf9l" Nov 28 13:35:41 crc kubenswrapper[4970]: I1128 13:35:41.765485 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/8bcf5bd0-9824-4c40-b009-8f2e50ad08b0-apiservice-cert\") pod \"keystone-operator-controller-manager-7456869864-hwf9l\" (UID: \"8bcf5bd0-9824-4c40-b009-8f2e50ad08b0\") " pod="openstack-operators/keystone-operator-controller-manager-7456869864-hwf9l" Nov 28 13:35:41 crc kubenswrapper[4970]: I1128 13:35:41.771025 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b7b7k\" (UniqueName: \"kubernetes.io/projected/8bcf5bd0-9824-4c40-b009-8f2e50ad08b0-kube-api-access-b7b7k\") pod \"keystone-operator-controller-manager-7456869864-hwf9l\" (UID: \"8bcf5bd0-9824-4c40-b009-8f2e50ad08b0\") " pod="openstack-operators/keystone-operator-controller-manager-7456869864-hwf9l" Nov 28 13:35:41 crc kubenswrapper[4970]: I1128 13:35:41.876989 4970 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-7456869864-hwf9l" Nov 28 13:35:42 crc kubenswrapper[4970]: I1128 13:35:42.069999 4970 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-7456869864-hwf9l"] Nov 28 13:35:42 crc kubenswrapper[4970]: W1128 13:35:42.075192 4970 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8bcf5bd0_9824_4c40_b009_8f2e50ad08b0.slice/crio-71ec5298b5cae377ff0092b1bd7f6809b43768e99329ffb1ec0266316a6c464d WatchSource:0}: Error finding container 71ec5298b5cae377ff0092b1bd7f6809b43768e99329ffb1ec0266316a6c464d: Status 404 returned error can't find the container with id 71ec5298b5cae377ff0092b1bd7f6809b43768e99329ffb1ec0266316a6c464d Nov 28 13:35:42 crc kubenswrapper[4970]: I1128 13:35:42.117787 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-7456869864-hwf9l" event={"ID":"8bcf5bd0-9824-4c40-b009-8f2e50ad08b0","Type":"ContainerStarted","Data":"71ec5298b5cae377ff0092b1bd7f6809b43768e99329ffb1ec0266316a6c464d"} Nov 28 13:35:42 crc kubenswrapper[4970]: I1128 13:35:42.120167 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/rabbitmq-server-0" event={"ID":"dfa2f2ae-c626-4fd8-a04c-a762e271a467","Type":"ContainerStarted","Data":"d34959bfbdbb8c13b83ef8e984b0cf24e5f22ab99a9594c089aeea056517a236"} Nov 28 13:35:42 crc kubenswrapper[4970]: I1128 13:35:42.120387 4970 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="keystone-kuttl-tests/rabbitmq-server-0" Nov 28 13:35:42 crc kubenswrapper[4970]: I1128 13:35:42.157838 4970 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="keystone-kuttl-tests/rabbitmq-server-0" podStartSLOduration=38.140687948 podStartE2EDuration="49.157813057s" podCreationTimestamp="2025-11-28 13:34:53 +0000 UTC" firstStartedPulling="2025-11-28 13:34:54.985037661 +0000 UTC m=+905.837919491" lastFinishedPulling="2025-11-28 13:35:06.00216276 +0000 UTC m=+916.855044600" observedRunningTime="2025-11-28 13:35:42.155177434 +0000 UTC m=+953.008059254" watchObservedRunningTime="2025-11-28 13:35:42.157813057 +0000 UTC m=+953.010694877" Nov 28 13:35:48 crc kubenswrapper[4970]: I1128 13:35:48.182937 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-7456869864-hwf9l" event={"ID":"8bcf5bd0-9824-4c40-b009-8f2e50ad08b0","Type":"ContainerStarted","Data":"3ec6975cc83d8f66892a684775bf8fc026283124c64a2fef05829db449116928"} Nov 28 13:35:48 crc kubenswrapper[4970]: I1128 13:35:48.183570 4970 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/keystone-operator-controller-manager-7456869864-hwf9l" Nov 28 13:35:48 crc kubenswrapper[4970]: I1128 13:35:48.210089 4970 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/keystone-operator-controller-manager-7456869864-hwf9l" podStartSLOduration=1.68895742 podStartE2EDuration="7.210054501s" podCreationTimestamp="2025-11-28 13:35:41 +0000 UTC" firstStartedPulling="2025-11-28 13:35:42.077291066 +0000 UTC m=+952.930172866" lastFinishedPulling="2025-11-28 13:35:47.598388147 +0000 UTC m=+958.451269947" observedRunningTime="2025-11-28 13:35:48.201406899 +0000 UTC m=+959.054288699" watchObservedRunningTime="2025-11-28 13:35:48.210054501 +0000 UTC m=+959.062936341" Nov 28 13:35:51 crc kubenswrapper[4970]: I1128 13:35:51.333274 4970 patch_prober.go:28] interesting pod/machine-config-daemon-tjrng container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 28 13:35:51 crc kubenswrapper[4970]: I1128 13:35:51.333731 4970 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-tjrng" podUID="70bedd43-c527-436e-b47b-0b9ec5b10601" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 28 13:35:51 crc kubenswrapper[4970]: I1128 13:35:51.333799 4970 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-tjrng" Nov 28 13:35:51 crc kubenswrapper[4970]: I1128 13:35:51.334857 4970 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"2ad74f7ddfaa8d711be3a8043f5b9573ad4e845f67f91479eefe9466a3a483c3"} pod="openshift-machine-config-operator/machine-config-daemon-tjrng" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Nov 28 13:35:51 crc kubenswrapper[4970]: I1128 13:35:51.334979 4970 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-tjrng" podUID="70bedd43-c527-436e-b47b-0b9ec5b10601" containerName="machine-config-daemon" containerID="cri-o://2ad74f7ddfaa8d711be3a8043f5b9573ad4e845f67f91479eefe9466a3a483c3" gracePeriod=600 Nov 28 13:35:52 crc kubenswrapper[4970]: I1128 13:35:52.220923 4970 
generic.go:334] "Generic (PLEG): container finished" podID="70bedd43-c527-436e-b47b-0b9ec5b10601" containerID="2ad74f7ddfaa8d711be3a8043f5b9573ad4e845f67f91479eefe9466a3a483c3" exitCode=0 Nov 28 13:35:52 crc kubenswrapper[4970]: I1128 13:35:52.221042 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-tjrng" event={"ID":"70bedd43-c527-436e-b47b-0b9ec5b10601","Type":"ContainerDied","Data":"2ad74f7ddfaa8d711be3a8043f5b9573ad4e845f67f91479eefe9466a3a483c3"} Nov 28 13:35:52 crc kubenswrapper[4970]: I1128 13:35:52.221428 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-tjrng" event={"ID":"70bedd43-c527-436e-b47b-0b9ec5b10601","Type":"ContainerStarted","Data":"cd30743a39e211613c4a030816a122a40e3a3cc19bf445b31c5fe37b451ef30e"} Nov 28 13:35:52 crc kubenswrapper[4970]: I1128 13:35:52.221450 4970 scope.go:117] "RemoveContainer" containerID="86a03fe6c83c6ac3411e98ed1337717f0b27b46f31a13d39550e07889da6badd" Nov 28 13:35:54 crc kubenswrapper[4970]: I1128 13:35:54.450540 4970 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="keystone-kuttl-tests/rabbitmq-server-0" Nov 28 13:36:01 crc kubenswrapper[4970]: I1128 13:36:01.881760 4970 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/keystone-operator-controller-manager-7456869864-hwf9l" Nov 28 13:36:03 crc kubenswrapper[4970]: I1128 13:36:03.986227 4970 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["keystone-kuttl-tests/keystone-1b71-account-create-update-z4zpl"] Nov 28 13:36:03 crc kubenswrapper[4970]: I1128 13:36:03.987203 4970 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone-1b71-account-create-update-z4zpl" Nov 28 13:36:03 crc kubenswrapper[4970]: I1128 13:36:03.990465 4970 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"keystone-db-secret" Nov 28 13:36:03 crc kubenswrapper[4970]: I1128 13:36:03.999827 4970 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/keystone-1b71-account-create-update-z4zpl"] Nov 28 13:36:04 crc kubenswrapper[4970]: I1128 13:36:04.080296 4970 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["keystone-kuttl-tests/keystone-db-create-v9896"] Nov 28 13:36:04 crc kubenswrapper[4970]: I1128 13:36:04.081227 4970 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="keystone-kuttl-tests/keystone-db-create-v9896" Nov 28 13:36:04 crc kubenswrapper[4970]: I1128 13:36:04.100760 4970 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/keystone-db-create-v9896"] Nov 28 13:36:04 crc kubenswrapper[4970]: I1128 13:36:04.131963 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bff0f605-69d2-4da7-bb6d-0ae425c12cec-operator-scripts\") pod \"keystone-1b71-account-create-update-z4zpl\" (UID: \"bff0f605-69d2-4da7-bb6d-0ae425c12cec\") " pod="keystone-kuttl-tests/keystone-1b71-account-create-update-z4zpl" Nov 28 13:36:04 crc kubenswrapper[4970]: I1128 13:36:04.132177 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2wm7f\" (UniqueName: \"kubernetes.io/projected/bff0f605-69d2-4da7-bb6d-0ae425c12cec-kube-api-access-2wm7f\") pod \"keystone-1b71-account-create-update-z4zpl\" (UID: \"bff0f605-69d2-4da7-bb6d-0ae425c12cec\") " pod="keystone-kuttl-tests/keystone-1b71-account-create-update-z4zpl" Nov 28 13:36:04 crc kubenswrapper[4970]: I1128 13:36:04.233686 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bff0f605-69d2-4da7-bb6d-0ae425c12cec-operator-scripts\") pod \"keystone-1b71-account-create-update-z4zpl\" (UID: \"bff0f605-69d2-4da7-bb6d-0ae425c12cec\") " pod="keystone-kuttl-tests/keystone-1b71-account-create-update-z4zpl" Nov 28 13:36:04 crc kubenswrapper[4970]: I1128 13:36:04.233792 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z4bbc\" (UniqueName: \"kubernetes.io/projected/8b663e10-8cd0-4afe-affa-cc906aacacf9-kube-api-access-z4bbc\") pod \"keystone-db-create-v9896\" (UID: \"8b663e10-8cd0-4afe-affa-cc906aacacf9\") " pod="keystone-kuttl-tests/keystone-db-create-v9896" Nov 28 13:36:04 crc kubenswrapper[4970]: I1128 13:36:04.233847 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2wm7f\" (UniqueName: \"kubernetes.io/projected/bff0f605-69d2-4da7-bb6d-0ae425c12cec-kube-api-access-2wm7f\") pod \"keystone-1b71-account-create-update-z4zpl\" (UID: \"bff0f605-69d2-4da7-bb6d-0ae425c12cec\") " pod="keystone-kuttl-tests/keystone-1b71-account-create-update-z4zpl" Nov 28 13:36:04 crc kubenswrapper[4970]: I1128 13:36:04.233892 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8b663e10-8cd0-4afe-affa-cc906aacacf9-operator-scripts\") pod \"keystone-db-create-v9896\" (UID: \"8b663e10-8cd0-4afe-affa-cc906aacacf9\") " pod="keystone-kuttl-tests/keystone-db-create-v9896" Nov 28 13:36:04 crc kubenswrapper[4970]: I1128 13:36:04.234475 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bff0f605-69d2-4da7-bb6d-0ae425c12cec-operator-scripts\") pod \"keystone-1b71-account-create-update-z4zpl\" (UID: \"bff0f605-69d2-4da7-bb6d-0ae425c12cec\") " pod="keystone-kuttl-tests/keystone-1b71-account-create-update-z4zpl" Nov 28 13:36:04 crc kubenswrapper[4970]: I1128 13:36:04.263372 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2wm7f\" (UniqueName: \"kubernetes.io/projected/bff0f605-69d2-4da7-bb6d-0ae425c12cec-kube-api-access-2wm7f\") pod 
\"keystone-1b71-account-create-update-z4zpl\" (UID: \"bff0f605-69d2-4da7-bb6d-0ae425c12cec\") " pod="keystone-kuttl-tests/keystone-1b71-account-create-update-z4zpl" Nov 28 13:36:04 crc kubenswrapper[4970]: I1128 13:36:04.326414 4970 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone-1b71-account-create-update-z4zpl" Nov 28 13:36:04 crc kubenswrapper[4970]: I1128 13:36:04.335592 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z4bbc\" (UniqueName: \"kubernetes.io/projected/8b663e10-8cd0-4afe-affa-cc906aacacf9-kube-api-access-z4bbc\") pod \"keystone-db-create-v9896\" (UID: \"8b663e10-8cd0-4afe-affa-cc906aacacf9\") " pod="keystone-kuttl-tests/keystone-db-create-v9896" Nov 28 13:36:04 crc kubenswrapper[4970]: I1128 13:36:04.335697 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8b663e10-8cd0-4afe-affa-cc906aacacf9-operator-scripts\") pod \"keystone-db-create-v9896\" (UID: \"8b663e10-8cd0-4afe-affa-cc906aacacf9\") " pod="keystone-kuttl-tests/keystone-db-create-v9896" Nov 28 13:36:04 crc kubenswrapper[4970]: I1128 13:36:04.336939 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8b663e10-8cd0-4afe-affa-cc906aacacf9-operator-scripts\") pod \"keystone-db-create-v9896\" (UID: \"8b663e10-8cd0-4afe-affa-cc906aacacf9\") " pod="keystone-kuttl-tests/keystone-db-create-v9896" Nov 28 13:36:04 crc kubenswrapper[4970]: I1128 13:36:04.354871 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z4bbc\" (UniqueName: \"kubernetes.io/projected/8b663e10-8cd0-4afe-affa-cc906aacacf9-kube-api-access-z4bbc\") pod \"keystone-db-create-v9896\" (UID: \"8b663e10-8cd0-4afe-affa-cc906aacacf9\") " pod="keystone-kuttl-tests/keystone-db-create-v9896" Nov 28 13:36:04 crc kubenswrapper[4970]: I1128 13:36:04.398661 4970 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="keystone-kuttl-tests/keystone-db-create-v9896" Nov 28 13:36:04 crc kubenswrapper[4970]: I1128 13:36:04.741854 4970 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/keystone-db-create-v9896"] Nov 28 13:36:04 crc kubenswrapper[4970]: W1128 13:36:04.745052 4970 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8b663e10_8cd0_4afe_affa_cc906aacacf9.slice/crio-51bdc4c5678bc4220063da4575851f6b4398ca1b6bcf3215a066d44a30a5cea6 WatchSource:0}: Error finding container 51bdc4c5678bc4220063da4575851f6b4398ca1b6bcf3215a066d44a30a5cea6: Status 404 returned error can't find the container with id 51bdc4c5678bc4220063da4575851f6b4398ca1b6bcf3215a066d44a30a5cea6 Nov 28 13:36:04 crc kubenswrapper[4970]: I1128 13:36:04.815600 4970 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/keystone-1b71-account-create-update-z4zpl"] Nov 28 13:36:04 crc kubenswrapper[4970]: W1128 13:36:04.827746 4970 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbff0f605_69d2_4da7_bb6d_0ae425c12cec.slice/crio-88b568b084c88e2b921fd5299f1d071aece174e2edb1aa8c16e3840ff5556fa4 WatchSource:0}: Error finding container 88b568b084c88e2b921fd5299f1d071aece174e2edb1aa8c16e3840ff5556fa4: Status 404 returned error can't find the container with id 88b568b084c88e2b921fd5299f1d071aece174e2edb1aa8c16e3840ff5556fa4 Nov 28 13:36:05 crc kubenswrapper[4970]: I1128 13:36:05.328804 4970 generic.go:334] "Generic (PLEG): container finished" podID="8b663e10-8cd0-4afe-affa-cc906aacacf9" containerID="daf7dd6f5a9566952e5e4beb78265791a07df0c0a0f1cd3b9a019aed65d7c030" exitCode=0 Nov 28 13:36:05 crc kubenswrapper[4970]: I1128 13:36:05.328852 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-db-create-v9896" event={"ID":"8b663e10-8cd0-4afe-affa-cc906aacacf9","Type":"ContainerDied","Data":"daf7dd6f5a9566952e5e4beb78265791a07df0c0a0f1cd3b9a019aed65d7c030"} Nov 28 13:36:05 crc kubenswrapper[4970]: I1128 13:36:05.329243 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-db-create-v9896" event={"ID":"8b663e10-8cd0-4afe-affa-cc906aacacf9","Type":"ContainerStarted","Data":"51bdc4c5678bc4220063da4575851f6b4398ca1b6bcf3215a066d44a30a5cea6"} Nov 28 13:36:05 crc kubenswrapper[4970]: I1128 13:36:05.330972 4970 generic.go:334] "Generic (PLEG): container finished" podID="bff0f605-69d2-4da7-bb6d-0ae425c12cec" containerID="442ae51f9a087ea62429a9eb102060380b7990952f01041b58a082dcba885912" exitCode=0 Nov 28 13:36:05 crc kubenswrapper[4970]: I1128 13:36:05.331022 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-1b71-account-create-update-z4zpl" event={"ID":"bff0f605-69d2-4da7-bb6d-0ae425c12cec","Type":"ContainerDied","Data":"442ae51f9a087ea62429a9eb102060380b7990952f01041b58a082dcba885912"} Nov 28 13:36:05 crc kubenswrapper[4970]: I1128 13:36:05.331087 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-1b71-account-create-update-z4zpl" event={"ID":"bff0f605-69d2-4da7-bb6d-0ae425c12cec","Type":"ContainerStarted","Data":"88b568b084c88e2b921fd5299f1d071aece174e2edb1aa8c16e3840ff5556fa4"} Nov 28 13:36:06 crc kubenswrapper[4970]: I1128 13:36:06.703450 4970 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="keystone-kuttl-tests/keystone-1b71-account-create-update-z4zpl" Nov 28 13:36:06 crc kubenswrapper[4970]: I1128 13:36:06.761480 4970 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone-db-create-v9896" Nov 28 13:36:06 crc kubenswrapper[4970]: I1128 13:36:06.778632 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bff0f605-69d2-4da7-bb6d-0ae425c12cec-operator-scripts\") pod \"bff0f605-69d2-4da7-bb6d-0ae425c12cec\" (UID: \"bff0f605-69d2-4da7-bb6d-0ae425c12cec\") " Nov 28 13:36:06 crc kubenswrapper[4970]: I1128 13:36:06.778787 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2wm7f\" (UniqueName: \"kubernetes.io/projected/bff0f605-69d2-4da7-bb6d-0ae425c12cec-kube-api-access-2wm7f\") pod \"bff0f605-69d2-4da7-bb6d-0ae425c12cec\" (UID: \"bff0f605-69d2-4da7-bb6d-0ae425c12cec\") " Nov 28 13:36:06 crc kubenswrapper[4970]: I1128 13:36:06.779501 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bff0f605-69d2-4da7-bb6d-0ae425c12cec-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "bff0f605-69d2-4da7-bb6d-0ae425c12cec" (UID: "bff0f605-69d2-4da7-bb6d-0ae425c12cec"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 28 13:36:06 crc kubenswrapper[4970]: I1128 13:36:06.786002 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bff0f605-69d2-4da7-bb6d-0ae425c12cec-kube-api-access-2wm7f" (OuterVolumeSpecName: "kube-api-access-2wm7f") pod "bff0f605-69d2-4da7-bb6d-0ae425c12cec" (UID: "bff0f605-69d2-4da7-bb6d-0ae425c12cec"). InnerVolumeSpecName "kube-api-access-2wm7f". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 28 13:36:06 crc kubenswrapper[4970]: I1128 13:36:06.879689 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z4bbc\" (UniqueName: \"kubernetes.io/projected/8b663e10-8cd0-4afe-affa-cc906aacacf9-kube-api-access-z4bbc\") pod \"8b663e10-8cd0-4afe-affa-cc906aacacf9\" (UID: \"8b663e10-8cd0-4afe-affa-cc906aacacf9\") " Nov 28 13:36:06 crc kubenswrapper[4970]: I1128 13:36:06.879820 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8b663e10-8cd0-4afe-affa-cc906aacacf9-operator-scripts\") pod \"8b663e10-8cd0-4afe-affa-cc906aacacf9\" (UID: \"8b663e10-8cd0-4afe-affa-cc906aacacf9\") " Nov 28 13:36:06 crc kubenswrapper[4970]: I1128 13:36:06.880094 4970 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bff0f605-69d2-4da7-bb6d-0ae425c12cec-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 28 13:36:06 crc kubenswrapper[4970]: I1128 13:36:06.880106 4970 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2wm7f\" (UniqueName: \"kubernetes.io/projected/bff0f605-69d2-4da7-bb6d-0ae425c12cec-kube-api-access-2wm7f\") on node \"crc\" DevicePath \"\"" Nov 28 13:36:06 crc kubenswrapper[4970]: I1128 13:36:06.881033 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8b663e10-8cd0-4afe-affa-cc906aacacf9-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "8b663e10-8cd0-4afe-affa-cc906aacacf9" (UID: "8b663e10-8cd0-4afe-affa-cc906aacacf9"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 28 13:36:06 crc kubenswrapper[4970]: I1128 13:36:06.884749 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8b663e10-8cd0-4afe-affa-cc906aacacf9-kube-api-access-z4bbc" (OuterVolumeSpecName: "kube-api-access-z4bbc") pod "8b663e10-8cd0-4afe-affa-cc906aacacf9" (UID: "8b663e10-8cd0-4afe-affa-cc906aacacf9"). InnerVolumeSpecName "kube-api-access-z4bbc". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 28 13:36:06 crc kubenswrapper[4970]: I1128 13:36:06.981428 4970 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8b663e10-8cd0-4afe-affa-cc906aacacf9-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 28 13:36:06 crc kubenswrapper[4970]: I1128 13:36:06.981478 4970 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z4bbc\" (UniqueName: \"kubernetes.io/projected/8b663e10-8cd0-4afe-affa-cc906aacacf9-kube-api-access-z4bbc\") on node \"crc\" DevicePath \"\"" Nov 28 13:36:07 crc kubenswrapper[4970]: I1128 13:36:07.353954 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-db-create-v9896" event={"ID":"8b663e10-8cd0-4afe-affa-cc906aacacf9","Type":"ContainerDied","Data":"51bdc4c5678bc4220063da4575851f6b4398ca1b6bcf3215a066d44a30a5cea6"} Nov 28 13:36:07 crc kubenswrapper[4970]: I1128 13:36:07.354014 4970 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="51bdc4c5678bc4220063da4575851f6b4398ca1b6bcf3215a066d44a30a5cea6" Nov 28 13:36:07 crc kubenswrapper[4970]: I1128 13:36:07.354028 4970 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="keystone-kuttl-tests/keystone-db-create-v9896" Nov 28 13:36:07 crc kubenswrapper[4970]: I1128 13:36:07.356557 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-1b71-account-create-update-z4zpl" event={"ID":"bff0f605-69d2-4da7-bb6d-0ae425c12cec","Type":"ContainerDied","Data":"88b568b084c88e2b921fd5299f1d071aece174e2edb1aa8c16e3840ff5556fa4"} Nov 28 13:36:07 crc kubenswrapper[4970]: I1128 13:36:07.356633 4970 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="88b568b084c88e2b921fd5299f1d071aece174e2edb1aa8c16e3840ff5556fa4" Nov 28 13:36:07 crc kubenswrapper[4970]: I1128 13:36:07.356747 4970 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone-1b71-account-create-update-z4zpl" Nov 28 13:36:09 crc kubenswrapper[4970]: I1128 13:36:09.703639 4970 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["keystone-kuttl-tests/keystone-db-sync-wwsk8"] Nov 28 13:36:09 crc kubenswrapper[4970]: E1128 13:36:09.704321 4970 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bff0f605-69d2-4da7-bb6d-0ae425c12cec" containerName="mariadb-account-create-update" Nov 28 13:36:09 crc kubenswrapper[4970]: I1128 13:36:09.704340 4970 state_mem.go:107] "Deleted CPUSet assignment" podUID="bff0f605-69d2-4da7-bb6d-0ae425c12cec" containerName="mariadb-account-create-update" Nov 28 13:36:09 crc kubenswrapper[4970]: E1128 13:36:09.704362 4970 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8b663e10-8cd0-4afe-affa-cc906aacacf9" containerName="mariadb-database-create" Nov 28 13:36:09 crc kubenswrapper[4970]: I1128 13:36:09.704371 4970 state_mem.go:107] "Deleted CPUSet assignment" podUID="8b663e10-8cd0-4afe-affa-cc906aacacf9" containerName="mariadb-database-create" Nov 28 13:36:09 crc kubenswrapper[4970]: I1128 13:36:09.704541 4970 memory_manager.go:354] "RemoveStaleState removing state" podUID="8b663e10-8cd0-4afe-affa-cc906aacacf9" containerName="mariadb-database-create" Nov 28 13:36:09 crc kubenswrapper[4970]: I1128 13:36:09.704815 4970 memory_manager.go:354] "RemoveStaleState removing state" podUID="bff0f605-69d2-4da7-bb6d-0ae425c12cec" containerName="mariadb-account-create-update" Nov 28 13:36:09 crc kubenswrapper[4970]: I1128 13:36:09.705500 4970 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="keystone-kuttl-tests/keystone-db-sync-wwsk8" Nov 28 13:36:09 crc kubenswrapper[4970]: I1128 13:36:09.717019 4970 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"keystone" Nov 28 13:36:09 crc kubenswrapper[4970]: I1128 13:36:09.718707 4970 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"keystone-scripts" Nov 28 13:36:09 crc kubenswrapper[4970]: I1128 13:36:09.719181 4970 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"keystone-keystone-dockercfg-qlkj8" Nov 28 13:36:09 crc kubenswrapper[4970]: I1128 13:36:09.727491 4970 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"keystone-config-data" Nov 28 13:36:09 crc kubenswrapper[4970]: I1128 13:36:09.729030 4970 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/keystone-db-sync-wwsk8"] Nov 28 13:36:09 crc kubenswrapper[4970]: I1128 13:36:09.831079 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vpf2d\" (UniqueName: \"kubernetes.io/projected/a9bbaadb-ea54-4b8e-b063-a8d8266e182a-kube-api-access-vpf2d\") pod \"keystone-db-sync-wwsk8\" (UID: \"a9bbaadb-ea54-4b8e-b063-a8d8266e182a\") " pod="keystone-kuttl-tests/keystone-db-sync-wwsk8" Nov 28 13:36:09 crc kubenswrapper[4970]: I1128 13:36:09.831132 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a9bbaadb-ea54-4b8e-b063-a8d8266e182a-config-data\") pod \"keystone-db-sync-wwsk8\" (UID: \"a9bbaadb-ea54-4b8e-b063-a8d8266e182a\") " pod="keystone-kuttl-tests/keystone-db-sync-wwsk8" Nov 28 13:36:09 crc kubenswrapper[4970]: I1128 13:36:09.932710 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vpf2d\" (UniqueName: \"kubernetes.io/projected/a9bbaadb-ea54-4b8e-b063-a8d8266e182a-kube-api-access-vpf2d\") pod \"keystone-db-sync-wwsk8\" (UID: \"a9bbaadb-ea54-4b8e-b063-a8d8266e182a\") " pod="keystone-kuttl-tests/keystone-db-sync-wwsk8" Nov 28 13:36:09 crc kubenswrapper[4970]: I1128 13:36:09.932767 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a9bbaadb-ea54-4b8e-b063-a8d8266e182a-config-data\") pod \"keystone-db-sync-wwsk8\" (UID: \"a9bbaadb-ea54-4b8e-b063-a8d8266e182a\") " pod="keystone-kuttl-tests/keystone-db-sync-wwsk8" Nov 28 13:36:09 crc kubenswrapper[4970]: I1128 13:36:09.940065 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a9bbaadb-ea54-4b8e-b063-a8d8266e182a-config-data\") pod \"keystone-db-sync-wwsk8\" (UID: \"a9bbaadb-ea54-4b8e-b063-a8d8266e182a\") " pod="keystone-kuttl-tests/keystone-db-sync-wwsk8" Nov 28 13:36:09 crc kubenswrapper[4970]: I1128 13:36:09.948252 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vpf2d\" (UniqueName: \"kubernetes.io/projected/a9bbaadb-ea54-4b8e-b063-a8d8266e182a-kube-api-access-vpf2d\") pod \"keystone-db-sync-wwsk8\" (UID: \"a9bbaadb-ea54-4b8e-b063-a8d8266e182a\") " pod="keystone-kuttl-tests/keystone-db-sync-wwsk8" Nov 28 13:36:10 crc kubenswrapper[4970]: I1128 13:36:10.045376 4970 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="keystone-kuttl-tests/keystone-db-sync-wwsk8" Nov 28 13:36:10 crc kubenswrapper[4970]: I1128 13:36:10.367356 4970 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/keystone-db-sync-wwsk8"] Nov 28 13:36:11 crc kubenswrapper[4970]: I1128 13:36:11.413820 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-db-sync-wwsk8" event={"ID":"a9bbaadb-ea54-4b8e-b063-a8d8266e182a","Type":"ContainerStarted","Data":"106e22abcaed8756d0e8f100024f38bf1f9d08d504137b34612c75847ece071f"} Nov 28 13:36:23 crc kubenswrapper[4970]: E1128 13:36:23.798081 4970 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-keystone:current-podified" Nov 28 13:36:23 crc kubenswrapper[4970]: E1128 13:36:23.798796 4970 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:keystone-db-sync,Image:quay.io/podified-antelope-centos9/openstack-keystone:current-podified,Command:[/bin/bash],Args:[-c keystone-manage db_sync],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:true,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/keystone/keystone.conf,SubPath:keystone.conf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-vpf2d,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42425,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:*42425,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod keystone-db-sync-wwsk8_keystone-kuttl-tests(a9bbaadb-ea54-4b8e-b063-a8d8266e182a): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Nov 28 13:36:23 crc kubenswrapper[4970]: E1128 13:36:23.800005 4970 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"keystone-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="keystone-kuttl-tests/keystone-db-sync-wwsk8" podUID="a9bbaadb-ea54-4b8e-b063-a8d8266e182a" Nov 28 13:36:24 crc kubenswrapper[4970]: E1128 13:36:24.534340 4970 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"keystone-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-keystone:current-podified\\\"\"" pod="keystone-kuttl-tests/keystone-db-sync-wwsk8" podUID="a9bbaadb-ea54-4b8e-b063-a8d8266e182a" Nov 28 13:36:37 crc kubenswrapper[4970]: 
I1128 13:36:37.385095 4970 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Nov 28 13:36:38 crc kubenswrapper[4970]: I1128 13:36:38.660629 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-db-sync-wwsk8" event={"ID":"a9bbaadb-ea54-4b8e-b063-a8d8266e182a","Type":"ContainerStarted","Data":"b98972926a03a7c3de9c1bcabca73c5fd08a14299da372458a680f241a8dfeb1"} Nov 28 13:36:38 crc kubenswrapper[4970]: I1128 13:36:38.682889 4970 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="keystone-kuttl-tests/keystone-db-sync-wwsk8" podStartSLOduration=2.180438637 podStartE2EDuration="29.68286541s" podCreationTimestamp="2025-11-28 13:36:09 +0000 UTC" firstStartedPulling="2025-11-28 13:36:10.391012508 +0000 UTC m=+981.243894318" lastFinishedPulling="2025-11-28 13:36:37.893439251 +0000 UTC m=+1008.746321091" observedRunningTime="2025-11-28 13:36:38.677255181 +0000 UTC m=+1009.530136981" watchObservedRunningTime="2025-11-28 13:36:38.68286541 +0000 UTC m=+1009.535747240" Nov 28 13:36:41 crc kubenswrapper[4970]: I1128 13:36:41.685910 4970 generic.go:334] "Generic (PLEG): container finished" podID="a9bbaadb-ea54-4b8e-b063-a8d8266e182a" containerID="b98972926a03a7c3de9c1bcabca73c5fd08a14299da372458a680f241a8dfeb1" exitCode=0 Nov 28 13:36:41 crc kubenswrapper[4970]: I1128 13:36:41.686013 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-db-sync-wwsk8" event={"ID":"a9bbaadb-ea54-4b8e-b063-a8d8266e182a","Type":"ContainerDied","Data":"b98972926a03a7c3de9c1bcabca73c5fd08a14299da372458a680f241a8dfeb1"} Nov 28 13:36:43 crc kubenswrapper[4970]: I1128 13:36:43.004354 4970 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone-db-sync-wwsk8" Nov 28 13:36:43 crc kubenswrapper[4970]: I1128 13:36:43.182019 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vpf2d\" (UniqueName: \"kubernetes.io/projected/a9bbaadb-ea54-4b8e-b063-a8d8266e182a-kube-api-access-vpf2d\") pod \"a9bbaadb-ea54-4b8e-b063-a8d8266e182a\" (UID: \"a9bbaadb-ea54-4b8e-b063-a8d8266e182a\") " Nov 28 13:36:43 crc kubenswrapper[4970]: I1128 13:36:43.182111 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a9bbaadb-ea54-4b8e-b063-a8d8266e182a-config-data\") pod \"a9bbaadb-ea54-4b8e-b063-a8d8266e182a\" (UID: \"a9bbaadb-ea54-4b8e-b063-a8d8266e182a\") " Nov 28 13:36:43 crc kubenswrapper[4970]: I1128 13:36:43.189050 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a9bbaadb-ea54-4b8e-b063-a8d8266e182a-kube-api-access-vpf2d" (OuterVolumeSpecName: "kube-api-access-vpf2d") pod "a9bbaadb-ea54-4b8e-b063-a8d8266e182a" (UID: "a9bbaadb-ea54-4b8e-b063-a8d8266e182a"). InnerVolumeSpecName "kube-api-access-vpf2d". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 28 13:36:43 crc kubenswrapper[4970]: I1128 13:36:43.225825 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a9bbaadb-ea54-4b8e-b063-a8d8266e182a-config-data" (OuterVolumeSpecName: "config-data") pod "a9bbaadb-ea54-4b8e-b063-a8d8266e182a" (UID: "a9bbaadb-ea54-4b8e-b063-a8d8266e182a"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 28 13:36:43 crc kubenswrapper[4970]: I1128 13:36:43.283876 4970 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vpf2d\" (UniqueName: \"kubernetes.io/projected/a9bbaadb-ea54-4b8e-b063-a8d8266e182a-kube-api-access-vpf2d\") on node \"crc\" DevicePath \"\"" Nov 28 13:36:43 crc kubenswrapper[4970]: I1128 13:36:43.283924 4970 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a9bbaadb-ea54-4b8e-b063-a8d8266e182a-config-data\") on node \"crc\" DevicePath \"\"" Nov 28 13:36:43 crc kubenswrapper[4970]: I1128 13:36:43.702929 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-db-sync-wwsk8" event={"ID":"a9bbaadb-ea54-4b8e-b063-a8d8266e182a","Type":"ContainerDied","Data":"106e22abcaed8756d0e8f100024f38bf1f9d08d504137b34612c75847ece071f"} Nov 28 13:36:43 crc kubenswrapper[4970]: I1128 13:36:43.702987 4970 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="106e22abcaed8756d0e8f100024f38bf1f9d08d504137b34612c75847ece071f" Nov 28 13:36:43 crc kubenswrapper[4970]: I1128 13:36:43.702989 4970 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone-db-sync-wwsk8" Nov 28 13:36:43 crc kubenswrapper[4970]: I1128 13:36:43.938005 4970 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["keystone-kuttl-tests/keystone-bootstrap-rmpks"] Nov 28 13:36:43 crc kubenswrapper[4970]: E1128 13:36:43.939471 4970 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a9bbaadb-ea54-4b8e-b063-a8d8266e182a" containerName="keystone-db-sync" Nov 28 13:36:43 crc kubenswrapper[4970]: I1128 13:36:43.939500 4970 state_mem.go:107] "Deleted CPUSet assignment" podUID="a9bbaadb-ea54-4b8e-b063-a8d8266e182a" containerName="keystone-db-sync" Nov 28 13:36:43 crc kubenswrapper[4970]: I1128 13:36:43.939678 4970 memory_manager.go:354] "RemoveStaleState removing state" podUID="a9bbaadb-ea54-4b8e-b063-a8d8266e182a" containerName="keystone-db-sync" Nov 28 13:36:43 crc kubenswrapper[4970]: I1128 13:36:43.940434 4970 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="keystone-kuttl-tests/keystone-bootstrap-rmpks" Nov 28 13:36:43 crc kubenswrapper[4970]: I1128 13:36:43.944389 4970 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"osp-secret" Nov 28 13:36:43 crc kubenswrapper[4970]: I1128 13:36:43.944518 4970 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"keystone-keystone-dockercfg-qlkj8" Nov 28 13:36:43 crc kubenswrapper[4970]: I1128 13:36:43.945554 4970 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"keystone-scripts" Nov 28 13:36:43 crc kubenswrapper[4970]: I1128 13:36:43.954558 4970 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"keystone-config-data" Nov 28 13:36:43 crc kubenswrapper[4970]: I1128 13:36:43.954783 4970 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"keystone" Nov 28 13:36:43 crc kubenswrapper[4970]: I1128 13:36:43.955454 4970 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/keystone-bootstrap-rmpks"] Nov 28 13:36:43 crc kubenswrapper[4970]: I1128 13:36:43.994660 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4148e34f-37f0-4af6-a375-aa67d8b1b2e6-scripts\") pod \"keystone-bootstrap-rmpks\" (UID: \"4148e34f-37f0-4af6-a375-aa67d8b1b2e6\") " pod="keystone-kuttl-tests/keystone-bootstrap-rmpks" Nov 28 13:36:43 crc kubenswrapper[4970]: I1128 13:36:43.994732 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/4148e34f-37f0-4af6-a375-aa67d8b1b2e6-fernet-keys\") pod \"keystone-bootstrap-rmpks\" (UID: \"4148e34f-37f0-4af6-a375-aa67d8b1b2e6\") " pod="keystone-kuttl-tests/keystone-bootstrap-rmpks" Nov 28 13:36:43 crc kubenswrapper[4970]: I1128 13:36:43.994774 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4148e34f-37f0-4af6-a375-aa67d8b1b2e6-config-data\") pod \"keystone-bootstrap-rmpks\" (UID: \"4148e34f-37f0-4af6-a375-aa67d8b1b2e6\") " pod="keystone-kuttl-tests/keystone-bootstrap-rmpks" Nov 28 13:36:43 crc kubenswrapper[4970]: I1128 13:36:43.994814 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/4148e34f-37f0-4af6-a375-aa67d8b1b2e6-credential-keys\") pod \"keystone-bootstrap-rmpks\" (UID: \"4148e34f-37f0-4af6-a375-aa67d8b1b2e6\") " pod="keystone-kuttl-tests/keystone-bootstrap-rmpks" Nov 28 13:36:43 crc kubenswrapper[4970]: I1128 13:36:43.994948 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bpd6r\" (UniqueName: \"kubernetes.io/projected/4148e34f-37f0-4af6-a375-aa67d8b1b2e6-kube-api-access-bpd6r\") pod \"keystone-bootstrap-rmpks\" (UID: \"4148e34f-37f0-4af6-a375-aa67d8b1b2e6\") " pod="keystone-kuttl-tests/keystone-bootstrap-rmpks" Nov 28 13:36:44 crc kubenswrapper[4970]: I1128 13:36:44.096346 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/4148e34f-37f0-4af6-a375-aa67d8b1b2e6-fernet-keys\") pod \"keystone-bootstrap-rmpks\" (UID: \"4148e34f-37f0-4af6-a375-aa67d8b1b2e6\") " pod="keystone-kuttl-tests/keystone-bootstrap-rmpks" Nov 28 13:36:44 crc 
kubenswrapper[4970]: I1128 13:36:44.096425 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4148e34f-37f0-4af6-a375-aa67d8b1b2e6-config-data\") pod \"keystone-bootstrap-rmpks\" (UID: \"4148e34f-37f0-4af6-a375-aa67d8b1b2e6\") " pod="keystone-kuttl-tests/keystone-bootstrap-rmpks" Nov 28 13:36:44 crc kubenswrapper[4970]: I1128 13:36:44.096485 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/4148e34f-37f0-4af6-a375-aa67d8b1b2e6-credential-keys\") pod \"keystone-bootstrap-rmpks\" (UID: \"4148e34f-37f0-4af6-a375-aa67d8b1b2e6\") " pod="keystone-kuttl-tests/keystone-bootstrap-rmpks" Nov 28 13:36:44 crc kubenswrapper[4970]: I1128 13:36:44.096516 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bpd6r\" (UniqueName: \"kubernetes.io/projected/4148e34f-37f0-4af6-a375-aa67d8b1b2e6-kube-api-access-bpd6r\") pod \"keystone-bootstrap-rmpks\" (UID: \"4148e34f-37f0-4af6-a375-aa67d8b1b2e6\") " pod="keystone-kuttl-tests/keystone-bootstrap-rmpks" Nov 28 13:36:44 crc kubenswrapper[4970]: I1128 13:36:44.096555 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4148e34f-37f0-4af6-a375-aa67d8b1b2e6-scripts\") pod \"keystone-bootstrap-rmpks\" (UID: \"4148e34f-37f0-4af6-a375-aa67d8b1b2e6\") " pod="keystone-kuttl-tests/keystone-bootstrap-rmpks" Nov 28 13:36:44 crc kubenswrapper[4970]: I1128 13:36:44.102834 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4148e34f-37f0-4af6-a375-aa67d8b1b2e6-scripts\") pod \"keystone-bootstrap-rmpks\" (UID: \"4148e34f-37f0-4af6-a375-aa67d8b1b2e6\") " pod="keystone-kuttl-tests/keystone-bootstrap-rmpks" Nov 28 13:36:44 crc kubenswrapper[4970]: I1128 13:36:44.103295 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/4148e34f-37f0-4af6-a375-aa67d8b1b2e6-fernet-keys\") pod \"keystone-bootstrap-rmpks\" (UID: \"4148e34f-37f0-4af6-a375-aa67d8b1b2e6\") " pod="keystone-kuttl-tests/keystone-bootstrap-rmpks" Nov 28 13:36:44 crc kubenswrapper[4970]: I1128 13:36:44.103514 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/4148e34f-37f0-4af6-a375-aa67d8b1b2e6-credential-keys\") pod \"keystone-bootstrap-rmpks\" (UID: \"4148e34f-37f0-4af6-a375-aa67d8b1b2e6\") " pod="keystone-kuttl-tests/keystone-bootstrap-rmpks" Nov 28 13:36:44 crc kubenswrapper[4970]: I1128 13:36:44.112052 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4148e34f-37f0-4af6-a375-aa67d8b1b2e6-config-data\") pod \"keystone-bootstrap-rmpks\" (UID: \"4148e34f-37f0-4af6-a375-aa67d8b1b2e6\") " pod="keystone-kuttl-tests/keystone-bootstrap-rmpks" Nov 28 13:36:44 crc kubenswrapper[4970]: I1128 13:36:44.125355 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bpd6r\" (UniqueName: \"kubernetes.io/projected/4148e34f-37f0-4af6-a375-aa67d8b1b2e6-kube-api-access-bpd6r\") pod \"keystone-bootstrap-rmpks\" (UID: \"4148e34f-37f0-4af6-a375-aa67d8b1b2e6\") " pod="keystone-kuttl-tests/keystone-bootstrap-rmpks" Nov 28 13:36:44 crc kubenswrapper[4970]: I1128 13:36:44.266794 4970 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="keystone-kuttl-tests/keystone-bootstrap-rmpks" Nov 28 13:36:44 crc kubenswrapper[4970]: I1128 13:36:44.750525 4970 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/keystone-bootstrap-rmpks"] Nov 28 13:36:44 crc kubenswrapper[4970]: W1128 13:36:44.757018 4970 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4148e34f_37f0_4af6_a375_aa67d8b1b2e6.slice/crio-d71ba084cb8753238af3cdc2cc657c474144653736e4d5dbf10765070f90e4e4 WatchSource:0}: Error finding container d71ba084cb8753238af3cdc2cc657c474144653736e4d5dbf10765070f90e4e4: Status 404 returned error can't find the container with id d71ba084cb8753238af3cdc2cc657c474144653736e4d5dbf10765070f90e4e4 Nov 28 13:36:45 crc kubenswrapper[4970]: I1128 13:36:45.720656 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-bootstrap-rmpks" event={"ID":"4148e34f-37f0-4af6-a375-aa67d8b1b2e6","Type":"ContainerStarted","Data":"6f8435a264f23b7d579e74f5f19a57cba7471b2418c2a6478d30533bbd6f0cb3"} Nov 28 13:36:45 crc kubenswrapper[4970]: I1128 13:36:45.720959 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-bootstrap-rmpks" event={"ID":"4148e34f-37f0-4af6-a375-aa67d8b1b2e6","Type":"ContainerStarted","Data":"d71ba084cb8753238af3cdc2cc657c474144653736e4d5dbf10765070f90e4e4"} Nov 28 13:36:45 crc kubenswrapper[4970]: I1128 13:36:45.742126 4970 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="keystone-kuttl-tests/keystone-bootstrap-rmpks" podStartSLOduration=2.742101529 podStartE2EDuration="2.742101529s" podCreationTimestamp="2025-11-28 13:36:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-28 13:36:45.73800535 +0000 UTC m=+1016.590887170" watchObservedRunningTime="2025-11-28 13:36:45.742101529 +0000 UTC m=+1016.594983349" Nov 28 13:36:48 crc kubenswrapper[4970]: I1128 13:36:48.747720 4970 generic.go:334] "Generic (PLEG): container finished" podID="4148e34f-37f0-4af6-a375-aa67d8b1b2e6" containerID="6f8435a264f23b7d579e74f5f19a57cba7471b2418c2a6478d30533bbd6f0cb3" exitCode=0 Nov 28 13:36:48 crc kubenswrapper[4970]: I1128 13:36:48.747812 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-bootstrap-rmpks" event={"ID":"4148e34f-37f0-4af6-a375-aa67d8b1b2e6","Type":"ContainerDied","Data":"6f8435a264f23b7d579e74f5f19a57cba7471b2418c2a6478d30533bbd6f0cb3"} Nov 28 13:36:50 crc kubenswrapper[4970]: I1128 13:36:50.085267 4970 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="keystone-kuttl-tests/keystone-bootstrap-rmpks" Nov 28 13:36:50 crc kubenswrapper[4970]: I1128 13:36:50.093673 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/4148e34f-37f0-4af6-a375-aa67d8b1b2e6-fernet-keys\") pod \"4148e34f-37f0-4af6-a375-aa67d8b1b2e6\" (UID: \"4148e34f-37f0-4af6-a375-aa67d8b1b2e6\") " Nov 28 13:36:50 crc kubenswrapper[4970]: I1128 13:36:50.093772 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4148e34f-37f0-4af6-a375-aa67d8b1b2e6-config-data\") pod \"4148e34f-37f0-4af6-a375-aa67d8b1b2e6\" (UID: \"4148e34f-37f0-4af6-a375-aa67d8b1b2e6\") " Nov 28 13:36:50 crc kubenswrapper[4970]: I1128 13:36:50.093831 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/4148e34f-37f0-4af6-a375-aa67d8b1b2e6-credential-keys\") pod \"4148e34f-37f0-4af6-a375-aa67d8b1b2e6\" (UID: \"4148e34f-37f0-4af6-a375-aa67d8b1b2e6\") " Nov 28 13:36:50 crc kubenswrapper[4970]: I1128 13:36:50.093964 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bpd6r\" (UniqueName: \"kubernetes.io/projected/4148e34f-37f0-4af6-a375-aa67d8b1b2e6-kube-api-access-bpd6r\") pod \"4148e34f-37f0-4af6-a375-aa67d8b1b2e6\" (UID: \"4148e34f-37f0-4af6-a375-aa67d8b1b2e6\") " Nov 28 13:36:50 crc kubenswrapper[4970]: I1128 13:36:50.094035 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4148e34f-37f0-4af6-a375-aa67d8b1b2e6-scripts\") pod \"4148e34f-37f0-4af6-a375-aa67d8b1b2e6\" (UID: \"4148e34f-37f0-4af6-a375-aa67d8b1b2e6\") " Nov 28 13:36:50 crc kubenswrapper[4970]: I1128 13:36:50.099724 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4148e34f-37f0-4af6-a375-aa67d8b1b2e6-kube-api-access-bpd6r" (OuterVolumeSpecName: "kube-api-access-bpd6r") pod "4148e34f-37f0-4af6-a375-aa67d8b1b2e6" (UID: "4148e34f-37f0-4af6-a375-aa67d8b1b2e6"). InnerVolumeSpecName "kube-api-access-bpd6r". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 28 13:36:50 crc kubenswrapper[4970]: I1128 13:36:50.099721 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4148e34f-37f0-4af6-a375-aa67d8b1b2e6-scripts" (OuterVolumeSpecName: "scripts") pod "4148e34f-37f0-4af6-a375-aa67d8b1b2e6" (UID: "4148e34f-37f0-4af6-a375-aa67d8b1b2e6"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 28 13:36:50 crc kubenswrapper[4970]: I1128 13:36:50.100953 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4148e34f-37f0-4af6-a375-aa67d8b1b2e6-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "4148e34f-37f0-4af6-a375-aa67d8b1b2e6" (UID: "4148e34f-37f0-4af6-a375-aa67d8b1b2e6"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 28 13:36:50 crc kubenswrapper[4970]: I1128 13:36:50.102758 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4148e34f-37f0-4af6-a375-aa67d8b1b2e6-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "4148e34f-37f0-4af6-a375-aa67d8b1b2e6" (UID: "4148e34f-37f0-4af6-a375-aa67d8b1b2e6"). InnerVolumeSpecName "fernet-keys". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 28 13:36:50 crc kubenswrapper[4970]: I1128 13:36:50.142679 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4148e34f-37f0-4af6-a375-aa67d8b1b2e6-config-data" (OuterVolumeSpecName: "config-data") pod "4148e34f-37f0-4af6-a375-aa67d8b1b2e6" (UID: "4148e34f-37f0-4af6-a375-aa67d8b1b2e6"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 28 13:36:50 crc kubenswrapper[4970]: I1128 13:36:50.196061 4970 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4148e34f-37f0-4af6-a375-aa67d8b1b2e6-config-data\") on node \"crc\" DevicePath \"\"" Nov 28 13:36:50 crc kubenswrapper[4970]: I1128 13:36:50.196120 4970 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/4148e34f-37f0-4af6-a375-aa67d8b1b2e6-credential-keys\") on node \"crc\" DevicePath \"\"" Nov 28 13:36:50 crc kubenswrapper[4970]: I1128 13:36:50.196141 4970 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bpd6r\" (UniqueName: \"kubernetes.io/projected/4148e34f-37f0-4af6-a375-aa67d8b1b2e6-kube-api-access-bpd6r\") on node \"crc\" DevicePath \"\"" Nov 28 13:36:50 crc kubenswrapper[4970]: I1128 13:36:50.196162 4970 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4148e34f-37f0-4af6-a375-aa67d8b1b2e6-scripts\") on node \"crc\" DevicePath \"\"" Nov 28 13:36:50 crc kubenswrapper[4970]: I1128 13:36:50.196180 4970 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/4148e34f-37f0-4af6-a375-aa67d8b1b2e6-fernet-keys\") on node \"crc\" DevicePath \"\"" Nov 28 13:36:50 crc kubenswrapper[4970]: I1128 13:36:50.767027 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-bootstrap-rmpks" event={"ID":"4148e34f-37f0-4af6-a375-aa67d8b1b2e6","Type":"ContainerDied","Data":"d71ba084cb8753238af3cdc2cc657c474144653736e4d5dbf10765070f90e4e4"} Nov 28 13:36:50 crc kubenswrapper[4970]: I1128 13:36:50.767083 4970 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d71ba084cb8753238af3cdc2cc657c474144653736e4d5dbf10765070f90e4e4" Nov 28 13:36:50 crc kubenswrapper[4970]: I1128 13:36:50.767123 4970 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone-bootstrap-rmpks" Nov 28 13:36:50 crc kubenswrapper[4970]: I1128 13:36:50.880027 4970 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["keystone-kuttl-tests/keystone-75b94fdfd-9979t"] Nov 28 13:36:50 crc kubenswrapper[4970]: E1128 13:36:50.880409 4970 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4148e34f-37f0-4af6-a375-aa67d8b1b2e6" containerName="keystone-bootstrap" Nov 28 13:36:50 crc kubenswrapper[4970]: I1128 13:36:50.880429 4970 state_mem.go:107] "Deleted CPUSet assignment" podUID="4148e34f-37f0-4af6-a375-aa67d8b1b2e6" containerName="keystone-bootstrap" Nov 28 13:36:50 crc kubenswrapper[4970]: I1128 13:36:50.880571 4970 memory_manager.go:354] "RemoveStaleState removing state" podUID="4148e34f-37f0-4af6-a375-aa67d8b1b2e6" containerName="keystone-bootstrap" Nov 28 13:36:50 crc kubenswrapper[4970]: I1128 13:36:50.881048 4970 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="keystone-kuttl-tests/keystone-75b94fdfd-9979t" Nov 28 13:36:50 crc kubenswrapper[4970]: I1128 13:36:50.883262 4970 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"keystone" Nov 28 13:36:50 crc kubenswrapper[4970]: I1128 13:36:50.884246 4970 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"keystone-scripts" Nov 28 13:36:50 crc kubenswrapper[4970]: I1128 13:36:50.884638 4970 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"keystone-keystone-dockercfg-qlkj8" Nov 28 13:36:50 crc kubenswrapper[4970]: I1128 13:36:50.887339 4970 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"keystone-config-data" Nov 28 13:36:50 crc kubenswrapper[4970]: I1128 13:36:50.894569 4970 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/keystone-75b94fdfd-9979t"] Nov 28 13:36:50 crc kubenswrapper[4970]: I1128 13:36:50.908086 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/28b7df7c-b8de-4ac6-955e-80dcd84267e1-scripts\") pod \"keystone-75b94fdfd-9979t\" (UID: \"28b7df7c-b8de-4ac6-955e-80dcd84267e1\") " pod="keystone-kuttl-tests/keystone-75b94fdfd-9979t" Nov 28 13:36:50 crc kubenswrapper[4970]: I1128 13:36:50.908245 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5z6vh\" (UniqueName: \"kubernetes.io/projected/28b7df7c-b8de-4ac6-955e-80dcd84267e1-kube-api-access-5z6vh\") pod \"keystone-75b94fdfd-9979t\" (UID: \"28b7df7c-b8de-4ac6-955e-80dcd84267e1\") " pod="keystone-kuttl-tests/keystone-75b94fdfd-9979t" Nov 28 13:36:50 crc kubenswrapper[4970]: I1128 13:36:50.908563 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/28b7df7c-b8de-4ac6-955e-80dcd84267e1-credential-keys\") pod \"keystone-75b94fdfd-9979t\" (UID: \"28b7df7c-b8de-4ac6-955e-80dcd84267e1\") " pod="keystone-kuttl-tests/keystone-75b94fdfd-9979t" Nov 28 13:36:50 crc kubenswrapper[4970]: I1128 13:36:50.908626 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/28b7df7c-b8de-4ac6-955e-80dcd84267e1-config-data\") pod \"keystone-75b94fdfd-9979t\" (UID: \"28b7df7c-b8de-4ac6-955e-80dcd84267e1\") " pod="keystone-kuttl-tests/keystone-75b94fdfd-9979t" Nov 28 13:36:50 crc kubenswrapper[4970]: I1128 13:36:50.908731 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/28b7df7c-b8de-4ac6-955e-80dcd84267e1-fernet-keys\") pod \"keystone-75b94fdfd-9979t\" (UID: \"28b7df7c-b8de-4ac6-955e-80dcd84267e1\") " pod="keystone-kuttl-tests/keystone-75b94fdfd-9979t" Nov 28 13:36:51 crc kubenswrapper[4970]: I1128 13:36:51.009911 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/28b7df7c-b8de-4ac6-955e-80dcd84267e1-fernet-keys\") pod \"keystone-75b94fdfd-9979t\" (UID: \"28b7df7c-b8de-4ac6-955e-80dcd84267e1\") " pod="keystone-kuttl-tests/keystone-75b94fdfd-9979t" Nov 28 13:36:51 crc kubenswrapper[4970]: I1128 13:36:51.010011 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/28b7df7c-b8de-4ac6-955e-80dcd84267e1-scripts\") pod \"keystone-75b94fdfd-9979t\" (UID: \"28b7df7c-b8de-4ac6-955e-80dcd84267e1\") " pod="keystone-kuttl-tests/keystone-75b94fdfd-9979t" Nov 28 13:36:51 crc kubenswrapper[4970]: I1128 13:36:51.010054 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5z6vh\" (UniqueName: \"kubernetes.io/projected/28b7df7c-b8de-4ac6-955e-80dcd84267e1-kube-api-access-5z6vh\") pod \"keystone-75b94fdfd-9979t\" (UID: \"28b7df7c-b8de-4ac6-955e-80dcd84267e1\") " pod="keystone-kuttl-tests/keystone-75b94fdfd-9979t" Nov 28 13:36:51 crc kubenswrapper[4970]: I1128 13:36:51.010091 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/28b7df7c-b8de-4ac6-955e-80dcd84267e1-credential-keys\") pod \"keystone-75b94fdfd-9979t\" (UID: \"28b7df7c-b8de-4ac6-955e-80dcd84267e1\") " pod="keystone-kuttl-tests/keystone-75b94fdfd-9979t" Nov 28 13:36:51 crc kubenswrapper[4970]: I1128 13:36:51.010117 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/28b7df7c-b8de-4ac6-955e-80dcd84267e1-config-data\") pod \"keystone-75b94fdfd-9979t\" (UID: \"28b7df7c-b8de-4ac6-955e-80dcd84267e1\") " pod="keystone-kuttl-tests/keystone-75b94fdfd-9979t" Nov 28 13:36:51 crc kubenswrapper[4970]: I1128 13:36:51.014833 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/28b7df7c-b8de-4ac6-955e-80dcd84267e1-scripts\") pod \"keystone-75b94fdfd-9979t\" (UID: \"28b7df7c-b8de-4ac6-955e-80dcd84267e1\") " pod="keystone-kuttl-tests/keystone-75b94fdfd-9979t" Nov 28 13:36:51 crc kubenswrapper[4970]: I1128 13:36:51.015110 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/28b7df7c-b8de-4ac6-955e-80dcd84267e1-credential-keys\") pod \"keystone-75b94fdfd-9979t\" (UID: \"28b7df7c-b8de-4ac6-955e-80dcd84267e1\") " pod="keystone-kuttl-tests/keystone-75b94fdfd-9979t" Nov 28 13:36:51 crc kubenswrapper[4970]: I1128 13:36:51.018485 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/28b7df7c-b8de-4ac6-955e-80dcd84267e1-fernet-keys\") pod \"keystone-75b94fdfd-9979t\" (UID: \"28b7df7c-b8de-4ac6-955e-80dcd84267e1\") " pod="keystone-kuttl-tests/keystone-75b94fdfd-9979t" Nov 28 13:36:51 crc kubenswrapper[4970]: I1128 13:36:51.018773 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/28b7df7c-b8de-4ac6-955e-80dcd84267e1-config-data\") pod \"keystone-75b94fdfd-9979t\" (UID: \"28b7df7c-b8de-4ac6-955e-80dcd84267e1\") " pod="keystone-kuttl-tests/keystone-75b94fdfd-9979t" Nov 28 13:36:51 crc kubenswrapper[4970]: I1128 13:36:51.036164 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5z6vh\" (UniqueName: \"kubernetes.io/projected/28b7df7c-b8de-4ac6-955e-80dcd84267e1-kube-api-access-5z6vh\") pod \"keystone-75b94fdfd-9979t\" (UID: \"28b7df7c-b8de-4ac6-955e-80dcd84267e1\") " pod="keystone-kuttl-tests/keystone-75b94fdfd-9979t" Nov 28 13:36:51 crc kubenswrapper[4970]: I1128 13:36:51.208330 4970 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="keystone-kuttl-tests/keystone-75b94fdfd-9979t" Nov 28 13:36:51 crc kubenswrapper[4970]: I1128 13:36:51.721927 4970 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/keystone-75b94fdfd-9979t"] Nov 28 13:36:51 crc kubenswrapper[4970]: W1128 13:36:51.740601 4970 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod28b7df7c_b8de_4ac6_955e_80dcd84267e1.slice/crio-92396b0403a1e915c8d8b90d1f3ac8eac5d6e9b9a571200f64e9f018650d507d WatchSource:0}: Error finding container 92396b0403a1e915c8d8b90d1f3ac8eac5d6e9b9a571200f64e9f018650d507d: Status 404 returned error can't find the container with id 92396b0403a1e915c8d8b90d1f3ac8eac5d6e9b9a571200f64e9f018650d507d Nov 28 13:36:51 crc kubenswrapper[4970]: I1128 13:36:51.776734 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-75b94fdfd-9979t" event={"ID":"28b7df7c-b8de-4ac6-955e-80dcd84267e1","Type":"ContainerStarted","Data":"92396b0403a1e915c8d8b90d1f3ac8eac5d6e9b9a571200f64e9f018650d507d"} Nov 28 13:36:53 crc kubenswrapper[4970]: I1128 13:36:53.794671 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-75b94fdfd-9979t" event={"ID":"28b7df7c-b8de-4ac6-955e-80dcd84267e1","Type":"ContainerStarted","Data":"8d21e62023d04163a1dc3228bdcf8813cc72a50bac07eb13bc708219ccd0d8f5"} Nov 28 13:36:53 crc kubenswrapper[4970]: I1128 13:36:53.795229 4970 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="keystone-kuttl-tests/keystone-75b94fdfd-9979t" Nov 28 13:36:53 crc kubenswrapper[4970]: I1128 13:36:53.825630 4970 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="keystone-kuttl-tests/keystone-75b94fdfd-9979t" podStartSLOduration=3.825607691 podStartE2EDuration="3.825607691s" podCreationTimestamp="2025-11-28 13:36:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-28 13:36:53.817700601 +0000 UTC m=+1024.670582461" watchObservedRunningTime="2025-11-28 13:36:53.825607691 +0000 UTC m=+1024.678489491" Nov 28 13:37:22 crc kubenswrapper[4970]: I1128 13:37:22.749560 4970 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="keystone-kuttl-tests/keystone-75b94fdfd-9979t" Nov 28 13:37:24 crc kubenswrapper[4970]: E1128 13:37:24.028969 4970 log.go:32] "Failed when writing line to log file" err="http2: stream closed" path="/var/log/pods/keystone-kuttl-tests_keystone-75b94fdfd-9979t_28b7df7c-b8de-4ac6-955e-80dcd84267e1/keystone-api/0.log" line={} Nov 28 13:37:24 crc kubenswrapper[4970]: I1128 13:37:24.363368 4970 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["keystone-kuttl-tests/keystone-7f8576dfd7-446ss"] Nov 28 13:37:24 crc kubenswrapper[4970]: I1128 13:37:24.364643 4970 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="keystone-kuttl-tests/keystone-7f8576dfd7-446ss" Nov 28 13:37:24 crc kubenswrapper[4970]: I1128 13:37:24.385842 4970 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/keystone-7f8576dfd7-446ss"] Nov 28 13:37:24 crc kubenswrapper[4970]: I1128 13:37:24.498450 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/bc366f2b-24d0-4041-b006-33721fb5b178-credential-keys\") pod \"keystone-7f8576dfd7-446ss\" (UID: \"bc366f2b-24d0-4041-b006-33721fb5b178\") " pod="keystone-kuttl-tests/keystone-7f8576dfd7-446ss" Nov 28 13:37:24 crc kubenswrapper[4970]: I1128 13:37:24.498532 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/bc366f2b-24d0-4041-b006-33721fb5b178-fernet-keys\") pod \"keystone-7f8576dfd7-446ss\" (UID: \"bc366f2b-24d0-4041-b006-33721fb5b178\") " pod="keystone-kuttl-tests/keystone-7f8576dfd7-446ss" Nov 28 13:37:24 crc kubenswrapper[4970]: I1128 13:37:24.498566 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4c72g\" (UniqueName: \"kubernetes.io/projected/bc366f2b-24d0-4041-b006-33721fb5b178-kube-api-access-4c72g\") pod \"keystone-7f8576dfd7-446ss\" (UID: \"bc366f2b-24d0-4041-b006-33721fb5b178\") " pod="keystone-kuttl-tests/keystone-7f8576dfd7-446ss" Nov 28 13:37:24 crc kubenswrapper[4970]: I1128 13:37:24.498620 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bc366f2b-24d0-4041-b006-33721fb5b178-scripts\") pod \"keystone-7f8576dfd7-446ss\" (UID: \"bc366f2b-24d0-4041-b006-33721fb5b178\") " pod="keystone-kuttl-tests/keystone-7f8576dfd7-446ss" Nov 28 13:37:24 crc kubenswrapper[4970]: I1128 13:37:24.498735 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bc366f2b-24d0-4041-b006-33721fb5b178-config-data\") pod \"keystone-7f8576dfd7-446ss\" (UID: \"bc366f2b-24d0-4041-b006-33721fb5b178\") " pod="keystone-kuttl-tests/keystone-7f8576dfd7-446ss" Nov 28 13:37:24 crc kubenswrapper[4970]: I1128 13:37:24.600037 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/bc366f2b-24d0-4041-b006-33721fb5b178-credential-keys\") pod \"keystone-7f8576dfd7-446ss\" (UID: \"bc366f2b-24d0-4041-b006-33721fb5b178\") " pod="keystone-kuttl-tests/keystone-7f8576dfd7-446ss" Nov 28 13:37:24 crc kubenswrapper[4970]: I1128 13:37:24.600105 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/bc366f2b-24d0-4041-b006-33721fb5b178-fernet-keys\") pod \"keystone-7f8576dfd7-446ss\" (UID: \"bc366f2b-24d0-4041-b006-33721fb5b178\") " pod="keystone-kuttl-tests/keystone-7f8576dfd7-446ss" Nov 28 13:37:24 crc kubenswrapper[4970]: I1128 13:37:24.600144 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4c72g\" (UniqueName: \"kubernetes.io/projected/bc366f2b-24d0-4041-b006-33721fb5b178-kube-api-access-4c72g\") pod \"keystone-7f8576dfd7-446ss\" (UID: \"bc366f2b-24d0-4041-b006-33721fb5b178\") " pod="keystone-kuttl-tests/keystone-7f8576dfd7-446ss" Nov 28 13:37:24 crc kubenswrapper[4970]: I1128 13:37:24.600191 4970 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bc366f2b-24d0-4041-b006-33721fb5b178-scripts\") pod \"keystone-7f8576dfd7-446ss\" (UID: \"bc366f2b-24d0-4041-b006-33721fb5b178\") " pod="keystone-kuttl-tests/keystone-7f8576dfd7-446ss" Nov 28 13:37:24 crc kubenswrapper[4970]: I1128 13:37:24.600255 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bc366f2b-24d0-4041-b006-33721fb5b178-config-data\") pod \"keystone-7f8576dfd7-446ss\" (UID: \"bc366f2b-24d0-4041-b006-33721fb5b178\") " pod="keystone-kuttl-tests/keystone-7f8576dfd7-446ss" Nov 28 13:37:24 crc kubenswrapper[4970]: E1128 13:37:24.600295 4970 secret.go:188] Couldn't get secret keystone-kuttl-tests/keystone: secret "keystone" not found Nov 28 13:37:24 crc kubenswrapper[4970]: E1128 13:37:24.600336 4970 secret.go:188] Couldn't get secret keystone-kuttl-tests/keystone: secret "keystone" not found Nov 28 13:37:24 crc kubenswrapper[4970]: E1128 13:37:24.600406 4970 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bc366f2b-24d0-4041-b006-33721fb5b178-credential-keys podName:bc366f2b-24d0-4041-b006-33721fb5b178 nodeName:}" failed. No retries permitted until 2025-11-28 13:37:25.100372828 +0000 UTC m=+1055.953254668 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "credential-keys" (UniqueName: "kubernetes.io/secret/bc366f2b-24d0-4041-b006-33721fb5b178-credential-keys") pod "keystone-7f8576dfd7-446ss" (UID: "bc366f2b-24d0-4041-b006-33721fb5b178") : secret "keystone" not found Nov 28 13:37:24 crc kubenswrapper[4970]: E1128 13:37:24.600426 4970 secret.go:188] Couldn't get secret keystone-kuttl-tests/keystone-scripts: secret "keystone-scripts" not found Nov 28 13:37:24 crc kubenswrapper[4970]: E1128 13:37:24.600445 4970 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bc366f2b-24d0-4041-b006-33721fb5b178-fernet-keys podName:bc366f2b-24d0-4041-b006-33721fb5b178 nodeName:}" failed. No retries permitted until 2025-11-28 13:37:25.1004277 +0000 UTC m=+1055.953309630 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "fernet-keys" (UniqueName: "kubernetes.io/secret/bc366f2b-24d0-4041-b006-33721fb5b178-fernet-keys") pod "keystone-7f8576dfd7-446ss" (UID: "bc366f2b-24d0-4041-b006-33721fb5b178") : secret "keystone" not found Nov 28 13:37:24 crc kubenswrapper[4970]: E1128 13:37:24.600519 4970 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bc366f2b-24d0-4041-b006-33721fb5b178-scripts podName:bc366f2b-24d0-4041-b006-33721fb5b178 nodeName:}" failed. No retries permitted until 2025-11-28 13:37:25.100493012 +0000 UTC m=+1055.953374922 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "scripts" (UniqueName: "kubernetes.io/secret/bc366f2b-24d0-4041-b006-33721fb5b178-scripts") pod "keystone-7f8576dfd7-446ss" (UID: "bc366f2b-24d0-4041-b006-33721fb5b178") : secret "keystone-scripts" not found Nov 28 13:37:24 crc kubenswrapper[4970]: E1128 13:37:24.600804 4970 secret.go:188] Couldn't get secret keystone-kuttl-tests/keystone-config-data: secret "keystone-config-data" not found Nov 28 13:37:24 crc kubenswrapper[4970]: E1128 13:37:24.600853 4970 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bc366f2b-24d0-4041-b006-33721fb5b178-config-data podName:bc366f2b-24d0-4041-b006-33721fb5b178 nodeName:}" failed. 
No retries permitted until 2025-11-28 13:37:25.100836471 +0000 UTC m=+1055.953718371 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/secret/bc366f2b-24d0-4041-b006-33721fb5b178-config-data") pod "keystone-7f8576dfd7-446ss" (UID: "bc366f2b-24d0-4041-b006-33721fb5b178") : secret "keystone-config-data" not found Nov 28 13:37:24 crc kubenswrapper[4970]: E1128 13:37:24.613594 4970 projected.go:194] Error preparing data for projected volume kube-api-access-4c72g for pod keystone-kuttl-tests/keystone-7f8576dfd7-446ss: failed to fetch token: serviceaccounts "keystone-keystone" not found Nov 28 13:37:24 crc kubenswrapper[4970]: E1128 13:37:24.613730 4970 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/bc366f2b-24d0-4041-b006-33721fb5b178-kube-api-access-4c72g podName:bc366f2b-24d0-4041-b006-33721fb5b178 nodeName:}" failed. No retries permitted until 2025-11-28 13:37:25.113695372 +0000 UTC m=+1055.966577212 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-4c72g" (UniqueName: "kubernetes.io/projected/bc366f2b-24d0-4041-b006-33721fb5b178-kube-api-access-4c72g") pod "keystone-7f8576dfd7-446ss" (UID: "bc366f2b-24d0-4041-b006-33721fb5b178") : failed to fetch token: serviceaccounts "keystone-keystone" not found Nov 28 13:37:24 crc kubenswrapper[4970]: I1128 13:37:24.618035 4970 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["keystone-kuttl-tests/keystone-db-sync-wwsk8"] Nov 28 13:37:24 crc kubenswrapper[4970]: I1128 13:37:24.626380 4970 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["keystone-kuttl-tests/keystone-db-sync-wwsk8"] Nov 28 13:37:24 crc kubenswrapper[4970]: I1128 13:37:24.631015 4970 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["keystone-kuttl-tests/keystone-bootstrap-rmpks"] Nov 28 13:37:24 crc kubenswrapper[4970]: I1128 13:37:24.635323 4970 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["keystone-kuttl-tests/keystone-bootstrap-rmpks"] Nov 28 13:37:24 crc kubenswrapper[4970]: I1128 13:37:24.650475 4970 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["keystone-kuttl-tests/keystone-75b94fdfd-9979t"] Nov 28 13:37:24 crc kubenswrapper[4970]: I1128 13:37:24.650697 4970 kuberuntime_container.go:808] "Killing container with a grace period" pod="keystone-kuttl-tests/keystone-75b94fdfd-9979t" podUID="28b7df7c-b8de-4ac6-955e-80dcd84267e1" containerName="keystone-api" containerID="cri-o://8d21e62023d04163a1dc3228bdcf8813cc72a50bac07eb13bc708219ccd0d8f5" gracePeriod=30 Nov 28 13:37:24 crc kubenswrapper[4970]: I1128 13:37:24.660670 4970 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["keystone-kuttl-tests/keystone-7f8576dfd7-446ss"] Nov 28 13:37:24 crc kubenswrapper[4970]: E1128 13:37:24.661162 4970 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[config-data credential-keys fernet-keys kube-api-access-4c72g scripts], unattached volumes=[], failed to process volumes=[]: context canceled" pod="keystone-kuttl-tests/keystone-7f8576dfd7-446ss" podUID="bc366f2b-24d0-4041-b006-33721fb5b178" Nov 28 13:37:24 crc kubenswrapper[4970]: I1128 13:37:24.716426 4970 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["keystone-kuttl-tests/keystone1b71-account-delete-n5c2n"] Nov 28 13:37:24 crc kubenswrapper[4970]: I1128 13:37:24.717654 4970 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="keystone-kuttl-tests/keystone1b71-account-delete-n5c2n" Nov 28 13:37:24 crc kubenswrapper[4970]: I1128 13:37:24.723891 4970 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/keystone1b71-account-delete-n5c2n"] Nov 28 13:37:24 crc kubenswrapper[4970]: I1128 13:37:24.802778 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-72vdx\" (UniqueName: \"kubernetes.io/projected/d30600c3-b4f9-413e-aeb9-f9df38e302c2-kube-api-access-72vdx\") pod \"keystone1b71-account-delete-n5c2n\" (UID: \"d30600c3-b4f9-413e-aeb9-f9df38e302c2\") " pod="keystone-kuttl-tests/keystone1b71-account-delete-n5c2n" Nov 28 13:37:24 crc kubenswrapper[4970]: I1128 13:37:24.802841 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d30600c3-b4f9-413e-aeb9-f9df38e302c2-operator-scripts\") pod \"keystone1b71-account-delete-n5c2n\" (UID: \"d30600c3-b4f9-413e-aeb9-f9df38e302c2\") " pod="keystone-kuttl-tests/keystone1b71-account-delete-n5c2n" Nov 28 13:37:24 crc kubenswrapper[4970]: I1128 13:37:24.904035 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-72vdx\" (UniqueName: \"kubernetes.io/projected/d30600c3-b4f9-413e-aeb9-f9df38e302c2-kube-api-access-72vdx\") pod \"keystone1b71-account-delete-n5c2n\" (UID: \"d30600c3-b4f9-413e-aeb9-f9df38e302c2\") " pod="keystone-kuttl-tests/keystone1b71-account-delete-n5c2n" Nov 28 13:37:24 crc kubenswrapper[4970]: I1128 13:37:24.904096 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d30600c3-b4f9-413e-aeb9-f9df38e302c2-operator-scripts\") pod \"keystone1b71-account-delete-n5c2n\" (UID: \"d30600c3-b4f9-413e-aeb9-f9df38e302c2\") " pod="keystone-kuttl-tests/keystone1b71-account-delete-n5c2n" Nov 28 13:37:24 crc kubenswrapper[4970]: I1128 13:37:24.905316 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d30600c3-b4f9-413e-aeb9-f9df38e302c2-operator-scripts\") pod \"keystone1b71-account-delete-n5c2n\" (UID: \"d30600c3-b4f9-413e-aeb9-f9df38e302c2\") " pod="keystone-kuttl-tests/keystone1b71-account-delete-n5c2n" Nov 28 13:37:24 crc kubenswrapper[4970]: I1128 13:37:24.931439 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-72vdx\" (UniqueName: \"kubernetes.io/projected/d30600c3-b4f9-413e-aeb9-f9df38e302c2-kube-api-access-72vdx\") pod \"keystone1b71-account-delete-n5c2n\" (UID: \"d30600c3-b4f9-413e-aeb9-f9df38e302c2\") " pod="keystone-kuttl-tests/keystone1b71-account-delete-n5c2n" Nov 28 13:37:25 crc kubenswrapper[4970]: I1128 13:37:25.039792 4970 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone1b71-account-delete-n5c2n" Nov 28 13:37:25 crc kubenswrapper[4970]: I1128 13:37:25.063007 4970 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="keystone-kuttl-tests/keystone-7f8576dfd7-446ss" Nov 28 13:37:25 crc kubenswrapper[4970]: I1128 13:37:25.114195 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bc366f2b-24d0-4041-b006-33721fb5b178-config-data\") pod \"keystone-7f8576dfd7-446ss\" (UID: \"bc366f2b-24d0-4041-b006-33721fb5b178\") " pod="keystone-kuttl-tests/keystone-7f8576dfd7-446ss" Nov 28 13:37:25 crc kubenswrapper[4970]: I1128 13:37:25.114399 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/bc366f2b-24d0-4041-b006-33721fb5b178-credential-keys\") pod \"keystone-7f8576dfd7-446ss\" (UID: \"bc366f2b-24d0-4041-b006-33721fb5b178\") " pod="keystone-kuttl-tests/keystone-7f8576dfd7-446ss" Nov 28 13:37:25 crc kubenswrapper[4970]: I1128 13:37:25.114441 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/bc366f2b-24d0-4041-b006-33721fb5b178-fernet-keys\") pod \"keystone-7f8576dfd7-446ss\" (UID: \"bc366f2b-24d0-4041-b006-33721fb5b178\") " pod="keystone-kuttl-tests/keystone-7f8576dfd7-446ss" Nov 28 13:37:25 crc kubenswrapper[4970]: I1128 13:37:25.114475 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4c72g\" (UniqueName: \"kubernetes.io/projected/bc366f2b-24d0-4041-b006-33721fb5b178-kube-api-access-4c72g\") pod \"keystone-7f8576dfd7-446ss\" (UID: \"bc366f2b-24d0-4041-b006-33721fb5b178\") " pod="keystone-kuttl-tests/keystone-7f8576dfd7-446ss" Nov 28 13:37:25 crc kubenswrapper[4970]: I1128 13:37:25.114520 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bc366f2b-24d0-4041-b006-33721fb5b178-scripts\") pod \"keystone-7f8576dfd7-446ss\" (UID: \"bc366f2b-24d0-4041-b006-33721fb5b178\") " pod="keystone-kuttl-tests/keystone-7f8576dfd7-446ss" Nov 28 13:37:25 crc kubenswrapper[4970]: E1128 13:37:25.114706 4970 secret.go:188] Couldn't get secret keystone-kuttl-tests/keystone-scripts: secret "keystone-scripts" not found Nov 28 13:37:25 crc kubenswrapper[4970]: E1128 13:37:25.114778 4970 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bc366f2b-24d0-4041-b006-33721fb5b178-scripts podName:bc366f2b-24d0-4041-b006-33721fb5b178 nodeName:}" failed. No retries permitted until 2025-11-28 13:37:26.114758616 +0000 UTC m=+1056.967640416 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "scripts" (UniqueName: "kubernetes.io/secret/bc366f2b-24d0-4041-b006-33721fb5b178-scripts") pod "keystone-7f8576dfd7-446ss" (UID: "bc366f2b-24d0-4041-b006-33721fb5b178") : secret "keystone-scripts" not found Nov 28 13:37:25 crc kubenswrapper[4970]: E1128 13:37:25.115354 4970 secret.go:188] Couldn't get secret keystone-kuttl-tests/keystone-config-data: secret "keystone-config-data" not found Nov 28 13:37:25 crc kubenswrapper[4970]: E1128 13:37:25.115405 4970 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bc366f2b-24d0-4041-b006-33721fb5b178-config-data podName:bc366f2b-24d0-4041-b006-33721fb5b178 nodeName:}" failed. No retries permitted until 2025-11-28 13:37:26.115393142 +0000 UTC m=+1056.968274942 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/secret/bc366f2b-24d0-4041-b006-33721fb5b178-config-data") pod "keystone-7f8576dfd7-446ss" (UID: "bc366f2b-24d0-4041-b006-33721fb5b178") : secret "keystone-config-data" not found Nov 28 13:37:25 crc kubenswrapper[4970]: E1128 13:37:25.115582 4970 secret.go:188] Couldn't get secret keystone-kuttl-tests/keystone: secret "keystone" not found Nov 28 13:37:25 crc kubenswrapper[4970]: E1128 13:37:25.115657 4970 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bc366f2b-24d0-4041-b006-33721fb5b178-credential-keys podName:bc366f2b-24d0-4041-b006-33721fb5b178 nodeName:}" failed. No retries permitted until 2025-11-28 13:37:26.115644789 +0000 UTC m=+1056.968526589 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "credential-keys" (UniqueName: "kubernetes.io/secret/bc366f2b-24d0-4041-b006-33721fb5b178-credential-keys") pod "keystone-7f8576dfd7-446ss" (UID: "bc366f2b-24d0-4041-b006-33721fb5b178") : secret "keystone" not found Nov 28 13:37:25 crc kubenswrapper[4970]: E1128 13:37:25.115723 4970 secret.go:188] Couldn't get secret keystone-kuttl-tests/keystone: secret "keystone" not found Nov 28 13:37:25 crc kubenswrapper[4970]: E1128 13:37:25.115759 4970 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bc366f2b-24d0-4041-b006-33721fb5b178-fernet-keys podName:bc366f2b-24d0-4041-b006-33721fb5b178 nodeName:}" failed. No retries permitted until 2025-11-28 13:37:26.115749102 +0000 UTC m=+1056.968630902 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "fernet-keys" (UniqueName: "kubernetes.io/secret/bc366f2b-24d0-4041-b006-33721fb5b178-fernet-keys") pod "keystone-7f8576dfd7-446ss" (UID: "bc366f2b-24d0-4041-b006-33721fb5b178") : secret "keystone" not found Nov 28 13:37:25 crc kubenswrapper[4970]: E1128 13:37:25.120013 4970 projected.go:194] Error preparing data for projected volume kube-api-access-4c72g for pod keystone-kuttl-tests/keystone-7f8576dfd7-446ss: failed to fetch token: serviceaccounts "keystone-keystone" not found Nov 28 13:37:25 crc kubenswrapper[4970]: E1128 13:37:25.120122 4970 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/bc366f2b-24d0-4041-b006-33721fb5b178-kube-api-access-4c72g podName:bc366f2b-24d0-4041-b006-33721fb5b178 nodeName:}" failed. No retries permitted until 2025-11-28 13:37:26.120084377 +0000 UTC m=+1056.972966187 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-4c72g" (UniqueName: "kubernetes.io/projected/bc366f2b-24d0-4041-b006-33721fb5b178-kube-api-access-4c72g") pod "keystone-7f8576dfd7-446ss" (UID: "bc366f2b-24d0-4041-b006-33721fb5b178") : failed to fetch token: serviceaccounts "keystone-keystone" not found Nov 28 13:37:25 crc kubenswrapper[4970]: I1128 13:37:25.150339 4970 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="keystone-kuttl-tests/keystone-7f8576dfd7-446ss" Nov 28 13:37:25 crc kubenswrapper[4970]: I1128 13:37:25.310428 4970 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/keystone1b71-account-delete-n5c2n"] Nov 28 13:37:25 crc kubenswrapper[4970]: I1128 13:37:25.389530 4970 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4148e34f-37f0-4af6-a375-aa67d8b1b2e6" path="/var/lib/kubelet/pods/4148e34f-37f0-4af6-a375-aa67d8b1b2e6/volumes" Nov 28 13:37:25 crc kubenswrapper[4970]: I1128 13:37:25.390274 4970 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a9bbaadb-ea54-4b8e-b063-a8d8266e182a" path="/var/lib/kubelet/pods/a9bbaadb-ea54-4b8e-b063-a8d8266e182a/volumes" Nov 28 13:37:26 crc kubenswrapper[4970]: I1128 13:37:26.074689 4970 generic.go:334] "Generic (PLEG): container finished" podID="d30600c3-b4f9-413e-aeb9-f9df38e302c2" containerID="5dbcba2b210b3de5accd5a01a27cf55d229ed1171df5f836eda2c2bf5325fd1e" exitCode=0 Nov 28 13:37:26 crc kubenswrapper[4970]: I1128 13:37:26.074787 4970 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone-7f8576dfd7-446ss" Nov 28 13:37:26 crc kubenswrapper[4970]: I1128 13:37:26.075687 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone1b71-account-delete-n5c2n" event={"ID":"d30600c3-b4f9-413e-aeb9-f9df38e302c2","Type":"ContainerDied","Data":"5dbcba2b210b3de5accd5a01a27cf55d229ed1171df5f836eda2c2bf5325fd1e"} Nov 28 13:37:26 crc kubenswrapper[4970]: I1128 13:37:26.075749 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone1b71-account-delete-n5c2n" event={"ID":"d30600c3-b4f9-413e-aeb9-f9df38e302c2","Type":"ContainerStarted","Data":"44a65a87343d4c24cdff1e820724b16d9a81c29f23cfd6eec76f2b88b43ef85f"} Nov 28 13:37:26 crc kubenswrapper[4970]: I1128 13:37:26.130412 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/bc366f2b-24d0-4041-b006-33721fb5b178-credential-keys\") pod \"keystone-7f8576dfd7-446ss\" (UID: \"bc366f2b-24d0-4041-b006-33721fb5b178\") " pod="keystone-kuttl-tests/keystone-7f8576dfd7-446ss" Nov 28 13:37:26 crc kubenswrapper[4970]: I1128 13:37:26.130477 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/bc366f2b-24d0-4041-b006-33721fb5b178-fernet-keys\") pod \"keystone-7f8576dfd7-446ss\" (UID: \"bc366f2b-24d0-4041-b006-33721fb5b178\") " pod="keystone-kuttl-tests/keystone-7f8576dfd7-446ss" Nov 28 13:37:26 crc kubenswrapper[4970]: I1128 13:37:26.130515 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4c72g\" (UniqueName: \"kubernetes.io/projected/bc366f2b-24d0-4041-b006-33721fb5b178-kube-api-access-4c72g\") pod \"keystone-7f8576dfd7-446ss\" (UID: \"bc366f2b-24d0-4041-b006-33721fb5b178\") " pod="keystone-kuttl-tests/keystone-7f8576dfd7-446ss" Nov 28 13:37:26 crc kubenswrapper[4970]: I1128 13:37:26.130565 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bc366f2b-24d0-4041-b006-33721fb5b178-scripts\") pod \"keystone-7f8576dfd7-446ss\" (UID: \"bc366f2b-24d0-4041-b006-33721fb5b178\") " pod="keystone-kuttl-tests/keystone-7f8576dfd7-446ss" Nov 28 13:37:26 crc kubenswrapper[4970]: I1128 13:37:26.130599 4970 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bc366f2b-24d0-4041-b006-33721fb5b178-config-data\") pod \"keystone-7f8576dfd7-446ss\" (UID: \"bc366f2b-24d0-4041-b006-33721fb5b178\") " pod="keystone-kuttl-tests/keystone-7f8576dfd7-446ss" Nov 28 13:37:26 crc kubenswrapper[4970]: E1128 13:37:26.130691 4970 secret.go:188] Couldn't get secret keystone-kuttl-tests/keystone: secret "keystone" not found Nov 28 13:37:26 crc kubenswrapper[4970]: E1128 13:37:26.130810 4970 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bc366f2b-24d0-4041-b006-33721fb5b178-fernet-keys podName:bc366f2b-24d0-4041-b006-33721fb5b178 nodeName:}" failed. No retries permitted until 2025-11-28 13:37:28.130783991 +0000 UTC m=+1058.983665791 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "fernet-keys" (UniqueName: "kubernetes.io/secret/bc366f2b-24d0-4041-b006-33721fb5b178-fernet-keys") pod "keystone-7f8576dfd7-446ss" (UID: "bc366f2b-24d0-4041-b006-33721fb5b178") : secret "keystone" not found Nov 28 13:37:26 crc kubenswrapper[4970]: E1128 13:37:26.130835 4970 secret.go:188] Couldn't get secret keystone-kuttl-tests/keystone-config-data: secret "keystone-config-data" not found Nov 28 13:37:26 crc kubenswrapper[4970]: E1128 13:37:26.130892 4970 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bc366f2b-24d0-4041-b006-33721fb5b178-config-data podName:bc366f2b-24d0-4041-b006-33721fb5b178 nodeName:}" failed. No retries permitted until 2025-11-28 13:37:28.130873353 +0000 UTC m=+1058.983755163 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/secret/bc366f2b-24d0-4041-b006-33721fb5b178-config-data") pod "keystone-7f8576dfd7-446ss" (UID: "bc366f2b-24d0-4041-b006-33721fb5b178") : secret "keystone-config-data" not found Nov 28 13:37:26 crc kubenswrapper[4970]: E1128 13:37:26.130944 4970 secret.go:188] Couldn't get secret keystone-kuttl-tests/keystone-scripts: secret "keystone-scripts" not found Nov 28 13:37:26 crc kubenswrapper[4970]: E1128 13:37:26.130971 4970 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bc366f2b-24d0-4041-b006-33721fb5b178-scripts podName:bc366f2b-24d0-4041-b006-33721fb5b178 nodeName:}" failed. No retries permitted until 2025-11-28 13:37:28.130962286 +0000 UTC m=+1058.983844096 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "scripts" (UniqueName: "kubernetes.io/secret/bc366f2b-24d0-4041-b006-33721fb5b178-scripts") pod "keystone-7f8576dfd7-446ss" (UID: "bc366f2b-24d0-4041-b006-33721fb5b178") : secret "keystone-scripts" not found Nov 28 13:37:26 crc kubenswrapper[4970]: E1128 13:37:26.130691 4970 secret.go:188] Couldn't get secret keystone-kuttl-tests/keystone: secret "keystone" not found Nov 28 13:37:26 crc kubenswrapper[4970]: E1128 13:37:26.131007 4970 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bc366f2b-24d0-4041-b006-33721fb5b178-credential-keys podName:bc366f2b-24d0-4041-b006-33721fb5b178 nodeName:}" failed. No retries permitted until 2025-11-28 13:37:28.130999517 +0000 UTC m=+1058.983881337 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "credential-keys" (UniqueName: "kubernetes.io/secret/bc366f2b-24d0-4041-b006-33721fb5b178-credential-keys") pod "keystone-7f8576dfd7-446ss" (UID: "bc366f2b-24d0-4041-b006-33721fb5b178") : secret "keystone" not found Nov 28 13:37:26 crc kubenswrapper[4970]: I1128 13:37:26.131026 4970 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["keystone-kuttl-tests/keystone-7f8576dfd7-446ss"] Nov 28 13:37:26 crc kubenswrapper[4970]: E1128 13:37:26.134326 4970 projected.go:194] Error preparing data for projected volume kube-api-access-4c72g for pod keystone-kuttl-tests/keystone-7f8576dfd7-446ss: failed to fetch token: pod "keystone-7f8576dfd7-446ss" not found Nov 28 13:37:26 crc kubenswrapper[4970]: E1128 13:37:26.134380 4970 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/bc366f2b-24d0-4041-b006-33721fb5b178-kube-api-access-4c72g podName:bc366f2b-24d0-4041-b006-33721fb5b178 nodeName:}" failed. No retries permitted until 2025-11-28 13:37:28.134364076 +0000 UTC m=+1058.987245886 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-4c72g" (UniqueName: "kubernetes.io/projected/bc366f2b-24d0-4041-b006-33721fb5b178-kube-api-access-4c72g") pod "keystone-7f8576dfd7-446ss" (UID: "bc366f2b-24d0-4041-b006-33721fb5b178") : failed to fetch token: pod "keystone-7f8576dfd7-446ss" not found Nov 28 13:37:26 crc kubenswrapper[4970]: I1128 13:37:26.135732 4970 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["keystone-kuttl-tests/keystone-7f8576dfd7-446ss"] Nov 28 13:37:26 crc kubenswrapper[4970]: I1128 13:37:26.232071 4970 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/bc366f2b-24d0-4041-b006-33721fb5b178-credential-keys\") on node \"crc\" DevicePath \"\"" Nov 28 13:37:26 crc kubenswrapper[4970]: I1128 13:37:26.232113 4970 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bc366f2b-24d0-4041-b006-33721fb5b178-scripts\") on node \"crc\" DevicePath \"\"" Nov 28 13:37:26 crc kubenswrapper[4970]: I1128 13:37:26.232127 4970 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/bc366f2b-24d0-4041-b006-33721fb5b178-fernet-keys\") on node \"crc\" DevicePath \"\"" Nov 28 13:37:26 crc kubenswrapper[4970]: I1128 13:37:26.232139 4970 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bc366f2b-24d0-4041-b006-33721fb5b178-config-data\") on node \"crc\" DevicePath \"\"" Nov 28 13:37:26 crc kubenswrapper[4970]: I1128 13:37:26.232151 4970 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4c72g\" (UniqueName: \"kubernetes.io/projected/bc366f2b-24d0-4041-b006-33721fb5b178-kube-api-access-4c72g\") on node \"crc\" DevicePath \"\"" Nov 28 13:37:27 crc kubenswrapper[4970]: I1128 13:37:27.389784 4970 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc366f2b-24d0-4041-b006-33721fb5b178" path="/var/lib/kubelet/pods/bc366f2b-24d0-4041-b006-33721fb5b178/volumes" Nov 28 13:37:27 crc kubenswrapper[4970]: I1128 13:37:27.435816 4970 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="keystone-kuttl-tests/keystone1b71-account-delete-n5c2n" Nov 28 13:37:27 crc kubenswrapper[4970]: I1128 13:37:27.554280 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-72vdx\" (UniqueName: \"kubernetes.io/projected/d30600c3-b4f9-413e-aeb9-f9df38e302c2-kube-api-access-72vdx\") pod \"d30600c3-b4f9-413e-aeb9-f9df38e302c2\" (UID: \"d30600c3-b4f9-413e-aeb9-f9df38e302c2\") " Nov 28 13:37:27 crc kubenswrapper[4970]: I1128 13:37:27.554573 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d30600c3-b4f9-413e-aeb9-f9df38e302c2-operator-scripts\") pod \"d30600c3-b4f9-413e-aeb9-f9df38e302c2\" (UID: \"d30600c3-b4f9-413e-aeb9-f9df38e302c2\") " Nov 28 13:37:27 crc kubenswrapper[4970]: I1128 13:37:27.556111 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d30600c3-b4f9-413e-aeb9-f9df38e302c2-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "d30600c3-b4f9-413e-aeb9-f9df38e302c2" (UID: "d30600c3-b4f9-413e-aeb9-f9df38e302c2"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 28 13:37:27 crc kubenswrapper[4970]: I1128 13:37:27.563478 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d30600c3-b4f9-413e-aeb9-f9df38e302c2-kube-api-access-72vdx" (OuterVolumeSpecName: "kube-api-access-72vdx") pod "d30600c3-b4f9-413e-aeb9-f9df38e302c2" (UID: "d30600c3-b4f9-413e-aeb9-f9df38e302c2"). InnerVolumeSpecName "kube-api-access-72vdx". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 28 13:37:27 crc kubenswrapper[4970]: I1128 13:37:27.656894 4970 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d30600c3-b4f9-413e-aeb9-f9df38e302c2-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 28 13:37:27 crc kubenswrapper[4970]: I1128 13:37:27.657208 4970 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-72vdx\" (UniqueName: \"kubernetes.io/projected/d30600c3-b4f9-413e-aeb9-f9df38e302c2-kube-api-access-72vdx\") on node \"crc\" DevicePath \"\"" Nov 28 13:37:28 crc kubenswrapper[4970]: I1128 13:37:28.091054 4970 generic.go:334] "Generic (PLEG): container finished" podID="28b7df7c-b8de-4ac6-955e-80dcd84267e1" containerID="8d21e62023d04163a1dc3228bdcf8813cc72a50bac07eb13bc708219ccd0d8f5" exitCode=0 Nov 28 13:37:28 crc kubenswrapper[4970]: I1128 13:37:28.091124 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-75b94fdfd-9979t" event={"ID":"28b7df7c-b8de-4ac6-955e-80dcd84267e1","Type":"ContainerDied","Data":"8d21e62023d04163a1dc3228bdcf8813cc72a50bac07eb13bc708219ccd0d8f5"} Nov 28 13:37:28 crc kubenswrapper[4970]: I1128 13:37:28.092869 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone1b71-account-delete-n5c2n" event={"ID":"d30600c3-b4f9-413e-aeb9-f9df38e302c2","Type":"ContainerDied","Data":"44a65a87343d4c24cdff1e820724b16d9a81c29f23cfd6eec76f2b88b43ef85f"} Nov 28 13:37:28 crc kubenswrapper[4970]: I1128 13:37:28.092910 4970 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="44a65a87343d4c24cdff1e820724b16d9a81c29f23cfd6eec76f2b88b43ef85f" Nov 28 13:37:28 crc kubenswrapper[4970]: I1128 13:37:28.092975 4970 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="keystone-kuttl-tests/keystone1b71-account-delete-n5c2n" Nov 28 13:37:29 crc kubenswrapper[4970]: I1128 13:37:29.103150 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-75b94fdfd-9979t" event={"ID":"28b7df7c-b8de-4ac6-955e-80dcd84267e1","Type":"ContainerDied","Data":"92396b0403a1e915c8d8b90d1f3ac8eac5d6e9b9a571200f64e9f018650d507d"} Nov 28 13:37:29 crc kubenswrapper[4970]: I1128 13:37:29.103429 4970 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="92396b0403a1e915c8d8b90d1f3ac8eac5d6e9b9a571200f64e9f018650d507d" Nov 28 13:37:29 crc kubenswrapper[4970]: I1128 13:37:29.104123 4970 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone-75b94fdfd-9979t" Nov 28 13:37:29 crc kubenswrapper[4970]: I1128 13:37:29.179078 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/28b7df7c-b8de-4ac6-955e-80dcd84267e1-scripts\") pod \"28b7df7c-b8de-4ac6-955e-80dcd84267e1\" (UID: \"28b7df7c-b8de-4ac6-955e-80dcd84267e1\") " Nov 28 13:37:29 crc kubenswrapper[4970]: I1128 13:37:29.179114 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/28b7df7c-b8de-4ac6-955e-80dcd84267e1-config-data\") pod \"28b7df7c-b8de-4ac6-955e-80dcd84267e1\" (UID: \"28b7df7c-b8de-4ac6-955e-80dcd84267e1\") " Nov 28 13:37:29 crc kubenswrapper[4970]: I1128 13:37:29.179169 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/28b7df7c-b8de-4ac6-955e-80dcd84267e1-fernet-keys\") pod \"28b7df7c-b8de-4ac6-955e-80dcd84267e1\" (UID: \"28b7df7c-b8de-4ac6-955e-80dcd84267e1\") " Nov 28 13:37:29 crc kubenswrapper[4970]: I1128 13:37:29.179204 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/28b7df7c-b8de-4ac6-955e-80dcd84267e1-credential-keys\") pod \"28b7df7c-b8de-4ac6-955e-80dcd84267e1\" (UID: \"28b7df7c-b8de-4ac6-955e-80dcd84267e1\") " Nov 28 13:37:29 crc kubenswrapper[4970]: I1128 13:37:29.179279 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5z6vh\" (UniqueName: \"kubernetes.io/projected/28b7df7c-b8de-4ac6-955e-80dcd84267e1-kube-api-access-5z6vh\") pod \"28b7df7c-b8de-4ac6-955e-80dcd84267e1\" (UID: \"28b7df7c-b8de-4ac6-955e-80dcd84267e1\") " Nov 28 13:37:29 crc kubenswrapper[4970]: I1128 13:37:29.183320 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/28b7df7c-b8de-4ac6-955e-80dcd84267e1-kube-api-access-5z6vh" (OuterVolumeSpecName: "kube-api-access-5z6vh") pod "28b7df7c-b8de-4ac6-955e-80dcd84267e1" (UID: "28b7df7c-b8de-4ac6-955e-80dcd84267e1"). InnerVolumeSpecName "kube-api-access-5z6vh". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 28 13:37:29 crc kubenswrapper[4970]: I1128 13:37:29.184299 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/28b7df7c-b8de-4ac6-955e-80dcd84267e1-scripts" (OuterVolumeSpecName: "scripts") pod "28b7df7c-b8de-4ac6-955e-80dcd84267e1" (UID: "28b7df7c-b8de-4ac6-955e-80dcd84267e1"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 28 13:37:29 crc kubenswrapper[4970]: I1128 13:37:29.184336 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/28b7df7c-b8de-4ac6-955e-80dcd84267e1-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "28b7df7c-b8de-4ac6-955e-80dcd84267e1" (UID: "28b7df7c-b8de-4ac6-955e-80dcd84267e1"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 28 13:37:29 crc kubenswrapper[4970]: I1128 13:37:29.184596 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/28b7df7c-b8de-4ac6-955e-80dcd84267e1-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "28b7df7c-b8de-4ac6-955e-80dcd84267e1" (UID: "28b7df7c-b8de-4ac6-955e-80dcd84267e1"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 28 13:37:29 crc kubenswrapper[4970]: I1128 13:37:29.199717 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/28b7df7c-b8de-4ac6-955e-80dcd84267e1-config-data" (OuterVolumeSpecName: "config-data") pod "28b7df7c-b8de-4ac6-955e-80dcd84267e1" (UID: "28b7df7c-b8de-4ac6-955e-80dcd84267e1"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 28 13:37:29 crc kubenswrapper[4970]: I1128 13:37:29.281712 4970 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/28b7df7c-b8de-4ac6-955e-80dcd84267e1-scripts\") on node \"crc\" DevicePath \"\"" Nov 28 13:37:29 crc kubenswrapper[4970]: I1128 13:37:29.281782 4970 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/28b7df7c-b8de-4ac6-955e-80dcd84267e1-config-data\") on node \"crc\" DevicePath \"\"" Nov 28 13:37:29 crc kubenswrapper[4970]: I1128 13:37:29.281810 4970 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/28b7df7c-b8de-4ac6-955e-80dcd84267e1-fernet-keys\") on node \"crc\" DevicePath \"\"" Nov 28 13:37:29 crc kubenswrapper[4970]: I1128 13:37:29.281835 4970 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/28b7df7c-b8de-4ac6-955e-80dcd84267e1-credential-keys\") on node \"crc\" DevicePath \"\"" Nov 28 13:37:29 crc kubenswrapper[4970]: I1128 13:37:29.281863 4970 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5z6vh\" (UniqueName: \"kubernetes.io/projected/28b7df7c-b8de-4ac6-955e-80dcd84267e1-kube-api-access-5z6vh\") on node \"crc\" DevicePath \"\"" Nov 28 13:37:29 crc kubenswrapper[4970]: I1128 13:37:29.752739 4970 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["keystone-kuttl-tests/keystone-db-create-v9896"] Nov 28 13:37:29 crc kubenswrapper[4970]: I1128 13:37:29.766146 4970 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["keystone-kuttl-tests/keystone-db-create-v9896"] Nov 28 13:37:29 crc kubenswrapper[4970]: I1128 13:37:29.774014 4970 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["keystone-kuttl-tests/keystone-1b71-account-create-update-z4zpl"] Nov 28 13:37:29 crc kubenswrapper[4970]: I1128 13:37:29.779640 4970 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["keystone-kuttl-tests/keystone1b71-account-delete-n5c2n"] Nov 28 13:37:29 crc kubenswrapper[4970]: I1128 13:37:29.785257 4970 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["keystone-kuttl-tests/keystone-1b71-account-create-update-z4zpl"] Nov 28 13:37:29 crc kubenswrapper[4970]: I1128 13:37:29.790303 4970 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["keystone-kuttl-tests/keystone1b71-account-delete-n5c2n"] Nov 28 13:37:30 crc kubenswrapper[4970]: I1128 13:37:30.114763 4970 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone-75b94fdfd-9979t" Nov 28 13:37:30 crc kubenswrapper[4970]: I1128 13:37:30.148630 4970 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["keystone-kuttl-tests/keystone-75b94fdfd-9979t"] Nov 28 13:37:30 crc kubenswrapper[4970]: I1128 13:37:30.157647 4970 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["keystone-kuttl-tests/keystone-75b94fdfd-9979t"] Nov 28 13:37:30 crc kubenswrapper[4970]: I1128 13:37:30.984273 4970 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["keystone-kuttl-tests/keystone-db-create-6hvg9"] Nov 28 13:37:30 crc kubenswrapper[4970]: E1128 13:37:30.985000 4970 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="28b7df7c-b8de-4ac6-955e-80dcd84267e1" containerName="keystone-api" Nov 28 13:37:30 crc kubenswrapper[4970]: I1128 13:37:30.985024 4970 state_mem.go:107] "Deleted CPUSet assignment" podUID="28b7df7c-b8de-4ac6-955e-80dcd84267e1" containerName="keystone-api" Nov 28 13:37:30 crc kubenswrapper[4970]: E1128 13:37:30.985049 4970 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d30600c3-b4f9-413e-aeb9-f9df38e302c2" containerName="mariadb-account-delete" Nov 28 13:37:30 crc kubenswrapper[4970]: I1128 13:37:30.985058 4970 state_mem.go:107] "Deleted CPUSet assignment" podUID="d30600c3-b4f9-413e-aeb9-f9df38e302c2" containerName="mariadb-account-delete" Nov 28 13:37:30 crc kubenswrapper[4970]: I1128 13:37:30.985271 4970 memory_manager.go:354] "RemoveStaleState removing state" podUID="d30600c3-b4f9-413e-aeb9-f9df38e302c2" containerName="mariadb-account-delete" Nov 28 13:37:30 crc kubenswrapper[4970]: I1128 13:37:30.985293 4970 memory_manager.go:354] "RemoveStaleState removing state" podUID="28b7df7c-b8de-4ac6-955e-80dcd84267e1" containerName="keystone-api" Nov 28 13:37:30 crc kubenswrapper[4970]: I1128 13:37:30.985980 4970 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone-db-create-6hvg9" Nov 28 13:37:30 crc kubenswrapper[4970]: I1128 13:37:30.994924 4970 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/keystone-db-create-6hvg9"] Nov 28 13:37:31 crc kubenswrapper[4970]: I1128 13:37:31.004033 4970 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["keystone-kuttl-tests/keystone-c65e-account-create-update-28mpl"] Nov 28 13:37:31 crc kubenswrapper[4970]: I1128 13:37:31.005275 4970 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="keystone-kuttl-tests/keystone-c65e-account-create-update-28mpl" Nov 28 13:37:31 crc kubenswrapper[4970]: I1128 13:37:31.008273 4970 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"keystone-db-secret" Nov 28 13:37:31 crc kubenswrapper[4970]: I1128 13:37:31.015484 4970 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/keystone-c65e-account-create-update-28mpl"] Nov 28 13:37:31 crc kubenswrapper[4970]: I1128 13:37:31.108083 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6aec0740-f76c-415e-927f-484e717f3aa1-operator-scripts\") pod \"keystone-c65e-account-create-update-28mpl\" (UID: \"6aec0740-f76c-415e-927f-484e717f3aa1\") " pod="keystone-kuttl-tests/keystone-c65e-account-create-update-28mpl" Nov 28 13:37:31 crc kubenswrapper[4970]: I1128 13:37:31.108159 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ccvwk\" (UniqueName: \"kubernetes.io/projected/fd27cede-0acf-40e3-b456-12c99ba203ae-kube-api-access-ccvwk\") pod \"keystone-db-create-6hvg9\" (UID: \"fd27cede-0acf-40e3-b456-12c99ba203ae\") " pod="keystone-kuttl-tests/keystone-db-create-6hvg9" Nov 28 13:37:31 crc kubenswrapper[4970]: I1128 13:37:31.108207 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2hm7j\" (UniqueName: \"kubernetes.io/projected/6aec0740-f76c-415e-927f-484e717f3aa1-kube-api-access-2hm7j\") pod \"keystone-c65e-account-create-update-28mpl\" (UID: \"6aec0740-f76c-415e-927f-484e717f3aa1\") " pod="keystone-kuttl-tests/keystone-c65e-account-create-update-28mpl" Nov 28 13:37:31 crc kubenswrapper[4970]: I1128 13:37:31.108266 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fd27cede-0acf-40e3-b456-12c99ba203ae-operator-scripts\") pod \"keystone-db-create-6hvg9\" (UID: \"fd27cede-0acf-40e3-b456-12c99ba203ae\") " pod="keystone-kuttl-tests/keystone-db-create-6hvg9" Nov 28 13:37:31 crc kubenswrapper[4970]: I1128 13:37:31.210086 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fd27cede-0acf-40e3-b456-12c99ba203ae-operator-scripts\") pod \"keystone-db-create-6hvg9\" (UID: \"fd27cede-0acf-40e3-b456-12c99ba203ae\") " pod="keystone-kuttl-tests/keystone-db-create-6hvg9" Nov 28 13:37:31 crc kubenswrapper[4970]: I1128 13:37:31.210193 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6aec0740-f76c-415e-927f-484e717f3aa1-operator-scripts\") pod \"keystone-c65e-account-create-update-28mpl\" (UID: \"6aec0740-f76c-415e-927f-484e717f3aa1\") " pod="keystone-kuttl-tests/keystone-c65e-account-create-update-28mpl" Nov 28 13:37:31 crc kubenswrapper[4970]: I1128 13:37:31.210307 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ccvwk\" (UniqueName: \"kubernetes.io/projected/fd27cede-0acf-40e3-b456-12c99ba203ae-kube-api-access-ccvwk\") pod \"keystone-db-create-6hvg9\" (UID: \"fd27cede-0acf-40e3-b456-12c99ba203ae\") " pod="keystone-kuttl-tests/keystone-db-create-6hvg9" Nov 28 13:37:31 crc kubenswrapper[4970]: I1128 13:37:31.210464 4970 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-2hm7j\" (UniqueName: \"kubernetes.io/projected/6aec0740-f76c-415e-927f-484e717f3aa1-kube-api-access-2hm7j\") pod \"keystone-c65e-account-create-update-28mpl\" (UID: \"6aec0740-f76c-415e-927f-484e717f3aa1\") " pod="keystone-kuttl-tests/keystone-c65e-account-create-update-28mpl" Nov 28 13:37:31 crc kubenswrapper[4970]: I1128 13:37:31.211397 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6aec0740-f76c-415e-927f-484e717f3aa1-operator-scripts\") pod \"keystone-c65e-account-create-update-28mpl\" (UID: \"6aec0740-f76c-415e-927f-484e717f3aa1\") " pod="keystone-kuttl-tests/keystone-c65e-account-create-update-28mpl" Nov 28 13:37:31 crc kubenswrapper[4970]: I1128 13:37:31.211692 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fd27cede-0acf-40e3-b456-12c99ba203ae-operator-scripts\") pod \"keystone-db-create-6hvg9\" (UID: \"fd27cede-0acf-40e3-b456-12c99ba203ae\") " pod="keystone-kuttl-tests/keystone-db-create-6hvg9" Nov 28 13:37:31 crc kubenswrapper[4970]: I1128 13:37:31.229897 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2hm7j\" (UniqueName: \"kubernetes.io/projected/6aec0740-f76c-415e-927f-484e717f3aa1-kube-api-access-2hm7j\") pod \"keystone-c65e-account-create-update-28mpl\" (UID: \"6aec0740-f76c-415e-927f-484e717f3aa1\") " pod="keystone-kuttl-tests/keystone-c65e-account-create-update-28mpl" Nov 28 13:37:31 crc kubenswrapper[4970]: I1128 13:37:31.241889 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ccvwk\" (UniqueName: \"kubernetes.io/projected/fd27cede-0acf-40e3-b456-12c99ba203ae-kube-api-access-ccvwk\") pod \"keystone-db-create-6hvg9\" (UID: \"fd27cede-0acf-40e3-b456-12c99ba203ae\") " pod="keystone-kuttl-tests/keystone-db-create-6hvg9" Nov 28 13:37:31 crc kubenswrapper[4970]: I1128 13:37:31.303570 4970 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone-db-create-6hvg9" Nov 28 13:37:31 crc kubenswrapper[4970]: I1128 13:37:31.321671 4970 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="keystone-kuttl-tests/keystone-c65e-account-create-update-28mpl" Nov 28 13:37:31 crc kubenswrapper[4970]: I1128 13:37:31.389547 4970 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="28b7df7c-b8de-4ac6-955e-80dcd84267e1" path="/var/lib/kubelet/pods/28b7df7c-b8de-4ac6-955e-80dcd84267e1/volumes" Nov 28 13:37:31 crc kubenswrapper[4970]: I1128 13:37:31.390016 4970 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8b663e10-8cd0-4afe-affa-cc906aacacf9" path="/var/lib/kubelet/pods/8b663e10-8cd0-4afe-affa-cc906aacacf9/volumes" Nov 28 13:37:31 crc kubenswrapper[4970]: I1128 13:37:31.390560 4970 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bff0f605-69d2-4da7-bb6d-0ae425c12cec" path="/var/lib/kubelet/pods/bff0f605-69d2-4da7-bb6d-0ae425c12cec/volumes" Nov 28 13:37:31 crc kubenswrapper[4970]: I1128 13:37:31.391038 4970 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d30600c3-b4f9-413e-aeb9-f9df38e302c2" path="/var/lib/kubelet/pods/d30600c3-b4f9-413e-aeb9-f9df38e302c2/volumes" Nov 28 13:37:31 crc kubenswrapper[4970]: I1128 13:37:31.520452 4970 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/keystone-db-create-6hvg9"] Nov 28 13:37:31 crc kubenswrapper[4970]: W1128 13:37:31.530289 4970 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfd27cede_0acf_40e3_b456_12c99ba203ae.slice/crio-0a91487751cf8518c1223f63cdedb85f64cbc982dcc962cf4485d27ea3a8e6d5 WatchSource:0}: Error finding container 0a91487751cf8518c1223f63cdedb85f64cbc982dcc962cf4485d27ea3a8e6d5: Status 404 returned error can't find the container with id 0a91487751cf8518c1223f63cdedb85f64cbc982dcc962cf4485d27ea3a8e6d5 Nov 28 13:37:31 crc kubenswrapper[4970]: I1128 13:37:31.562717 4970 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/keystone-c65e-account-create-update-28mpl"] Nov 28 13:37:31 crc kubenswrapper[4970]: W1128 13:37:31.565882 4970 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6aec0740_f76c_415e_927f_484e717f3aa1.slice/crio-5c292e9720c878616d5825e03702e8889ec6b276ae7e08c4487b4aeec1bb7a8c WatchSource:0}: Error finding container 5c292e9720c878616d5825e03702e8889ec6b276ae7e08c4487b4aeec1bb7a8c: Status 404 returned error can't find the container with id 5c292e9720c878616d5825e03702e8889ec6b276ae7e08c4487b4aeec1bb7a8c Nov 28 13:37:32 crc kubenswrapper[4970]: I1128 13:37:32.134741 4970 generic.go:334] "Generic (PLEG): container finished" podID="fd27cede-0acf-40e3-b456-12c99ba203ae" containerID="6701c5716a5247433b429d6f5968196a7fa1e126e588af748466c746cb7cc161" exitCode=0 Nov 28 13:37:32 crc kubenswrapper[4970]: I1128 13:37:32.135017 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-db-create-6hvg9" event={"ID":"fd27cede-0acf-40e3-b456-12c99ba203ae","Type":"ContainerDied","Data":"6701c5716a5247433b429d6f5968196a7fa1e126e588af748466c746cb7cc161"} Nov 28 13:37:32 crc kubenswrapper[4970]: I1128 13:37:32.135083 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-db-create-6hvg9" event={"ID":"fd27cede-0acf-40e3-b456-12c99ba203ae","Type":"ContainerStarted","Data":"0a91487751cf8518c1223f63cdedb85f64cbc982dcc962cf4485d27ea3a8e6d5"} Nov 28 13:37:32 crc kubenswrapper[4970]: I1128 13:37:32.140541 4970 generic.go:334] "Generic (PLEG): container finished" 
podID="6aec0740-f76c-415e-927f-484e717f3aa1" containerID="212632bd085249a884fbb45055201c25427f0ae5c9635a47c2d8655d2e01588f" exitCode=0 Nov 28 13:37:32 crc kubenswrapper[4970]: I1128 13:37:32.140607 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-c65e-account-create-update-28mpl" event={"ID":"6aec0740-f76c-415e-927f-484e717f3aa1","Type":"ContainerDied","Data":"212632bd085249a884fbb45055201c25427f0ae5c9635a47c2d8655d2e01588f"} Nov 28 13:37:32 crc kubenswrapper[4970]: I1128 13:37:32.140651 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-c65e-account-create-update-28mpl" event={"ID":"6aec0740-f76c-415e-927f-484e717f3aa1","Type":"ContainerStarted","Data":"5c292e9720c878616d5825e03702e8889ec6b276ae7e08c4487b4aeec1bb7a8c"} Nov 28 13:37:33 crc kubenswrapper[4970]: I1128 13:37:33.512763 4970 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone-c65e-account-create-update-28mpl" Nov 28 13:37:33 crc kubenswrapper[4970]: I1128 13:37:33.518499 4970 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone-db-create-6hvg9" Nov 28 13:37:33 crc kubenswrapper[4970]: I1128 13:37:33.550702 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6aec0740-f76c-415e-927f-484e717f3aa1-operator-scripts\") pod \"6aec0740-f76c-415e-927f-484e717f3aa1\" (UID: \"6aec0740-f76c-415e-927f-484e717f3aa1\") " Nov 28 13:37:33 crc kubenswrapper[4970]: I1128 13:37:33.550843 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2hm7j\" (UniqueName: \"kubernetes.io/projected/6aec0740-f76c-415e-927f-484e717f3aa1-kube-api-access-2hm7j\") pod \"6aec0740-f76c-415e-927f-484e717f3aa1\" (UID: \"6aec0740-f76c-415e-927f-484e717f3aa1\") " Nov 28 13:37:33 crc kubenswrapper[4970]: I1128 13:37:33.552660 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6aec0740-f76c-415e-927f-484e717f3aa1-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "6aec0740-f76c-415e-927f-484e717f3aa1" (UID: "6aec0740-f76c-415e-927f-484e717f3aa1"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 28 13:37:33 crc kubenswrapper[4970]: I1128 13:37:33.556663 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6aec0740-f76c-415e-927f-484e717f3aa1-kube-api-access-2hm7j" (OuterVolumeSpecName: "kube-api-access-2hm7j") pod "6aec0740-f76c-415e-927f-484e717f3aa1" (UID: "6aec0740-f76c-415e-927f-484e717f3aa1"). InnerVolumeSpecName "kube-api-access-2hm7j". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 28 13:37:33 crc kubenswrapper[4970]: I1128 13:37:33.652718 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fd27cede-0acf-40e3-b456-12c99ba203ae-operator-scripts\") pod \"fd27cede-0acf-40e3-b456-12c99ba203ae\" (UID: \"fd27cede-0acf-40e3-b456-12c99ba203ae\") " Nov 28 13:37:33 crc kubenswrapper[4970]: I1128 13:37:33.652811 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ccvwk\" (UniqueName: \"kubernetes.io/projected/fd27cede-0acf-40e3-b456-12c99ba203ae-kube-api-access-ccvwk\") pod \"fd27cede-0acf-40e3-b456-12c99ba203ae\" (UID: \"fd27cede-0acf-40e3-b456-12c99ba203ae\") " Nov 28 13:37:33 crc kubenswrapper[4970]: I1128 13:37:33.653288 4970 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2hm7j\" (UniqueName: \"kubernetes.io/projected/6aec0740-f76c-415e-927f-484e717f3aa1-kube-api-access-2hm7j\") on node \"crc\" DevicePath \"\"" Nov 28 13:37:33 crc kubenswrapper[4970]: I1128 13:37:33.653320 4970 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6aec0740-f76c-415e-927f-484e717f3aa1-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 28 13:37:33 crc kubenswrapper[4970]: I1128 13:37:33.653893 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fd27cede-0acf-40e3-b456-12c99ba203ae-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "fd27cede-0acf-40e3-b456-12c99ba203ae" (UID: "fd27cede-0acf-40e3-b456-12c99ba203ae"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 28 13:37:33 crc kubenswrapper[4970]: I1128 13:37:33.656996 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fd27cede-0acf-40e3-b456-12c99ba203ae-kube-api-access-ccvwk" (OuterVolumeSpecName: "kube-api-access-ccvwk") pod "fd27cede-0acf-40e3-b456-12c99ba203ae" (UID: "fd27cede-0acf-40e3-b456-12c99ba203ae"). InnerVolumeSpecName "kube-api-access-ccvwk". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 28 13:37:33 crc kubenswrapper[4970]: I1128 13:37:33.754528 4970 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fd27cede-0acf-40e3-b456-12c99ba203ae-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 28 13:37:33 crc kubenswrapper[4970]: I1128 13:37:33.754574 4970 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ccvwk\" (UniqueName: \"kubernetes.io/projected/fd27cede-0acf-40e3-b456-12c99ba203ae-kube-api-access-ccvwk\") on node \"crc\" DevicePath \"\"" Nov 28 13:37:34 crc kubenswrapper[4970]: I1128 13:37:34.158449 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-c65e-account-create-update-28mpl" event={"ID":"6aec0740-f76c-415e-927f-484e717f3aa1","Type":"ContainerDied","Data":"5c292e9720c878616d5825e03702e8889ec6b276ae7e08c4487b4aeec1bb7a8c"} Nov 28 13:37:34 crc kubenswrapper[4970]: I1128 13:37:34.158781 4970 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5c292e9720c878616d5825e03702e8889ec6b276ae7e08c4487b4aeec1bb7a8c" Nov 28 13:37:34 crc kubenswrapper[4970]: I1128 13:37:34.158486 4970 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="keystone-kuttl-tests/keystone-c65e-account-create-update-28mpl" Nov 28 13:37:34 crc kubenswrapper[4970]: I1128 13:37:34.160330 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-db-create-6hvg9" event={"ID":"fd27cede-0acf-40e3-b456-12c99ba203ae","Type":"ContainerDied","Data":"0a91487751cf8518c1223f63cdedb85f64cbc982dcc962cf4485d27ea3a8e6d5"} Nov 28 13:37:34 crc kubenswrapper[4970]: I1128 13:37:34.160373 4970 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0a91487751cf8518c1223f63cdedb85f64cbc982dcc962cf4485d27ea3a8e6d5" Nov 28 13:37:34 crc kubenswrapper[4970]: I1128 13:37:34.160386 4970 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone-db-create-6hvg9" Nov 28 13:37:36 crc kubenswrapper[4970]: I1128 13:37:36.567655 4970 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["keystone-kuttl-tests/keystone-db-sync-d7h7j"] Nov 28 13:37:36 crc kubenswrapper[4970]: E1128 13:37:36.568440 4970 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6aec0740-f76c-415e-927f-484e717f3aa1" containerName="mariadb-account-create-update" Nov 28 13:37:36 crc kubenswrapper[4970]: I1128 13:37:36.568462 4970 state_mem.go:107] "Deleted CPUSet assignment" podUID="6aec0740-f76c-415e-927f-484e717f3aa1" containerName="mariadb-account-create-update" Nov 28 13:37:36 crc kubenswrapper[4970]: E1128 13:37:36.568492 4970 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fd27cede-0acf-40e3-b456-12c99ba203ae" containerName="mariadb-database-create" Nov 28 13:37:36 crc kubenswrapper[4970]: I1128 13:37:36.568504 4970 state_mem.go:107] "Deleted CPUSet assignment" podUID="fd27cede-0acf-40e3-b456-12c99ba203ae" containerName="mariadb-database-create" Nov 28 13:37:36 crc kubenswrapper[4970]: I1128 13:37:36.568684 4970 memory_manager.go:354] "RemoveStaleState removing state" podUID="fd27cede-0acf-40e3-b456-12c99ba203ae" containerName="mariadb-database-create" Nov 28 13:37:36 crc kubenswrapper[4970]: I1128 13:37:36.568707 4970 memory_manager.go:354] "RemoveStaleState removing state" podUID="6aec0740-f76c-415e-927f-484e717f3aa1" containerName="mariadb-account-create-update" Nov 28 13:37:36 crc kubenswrapper[4970]: I1128 13:37:36.569427 4970 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="keystone-kuttl-tests/keystone-db-sync-d7h7j" Nov 28 13:37:36 crc kubenswrapper[4970]: I1128 13:37:36.573776 4970 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"keystone" Nov 28 13:37:36 crc kubenswrapper[4970]: I1128 13:37:36.573792 4970 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"keystone-scripts" Nov 28 13:37:36 crc kubenswrapper[4970]: I1128 13:37:36.573836 4970 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"keystone-keystone-dockercfg-ztgrb" Nov 28 13:37:36 crc kubenswrapper[4970]: I1128 13:37:36.574052 4970 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"keystone-config-data" Nov 28 13:37:36 crc kubenswrapper[4970]: I1128 13:37:36.591068 4970 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/keystone-db-sync-d7h7j"] Nov 28 13:37:36 crc kubenswrapper[4970]: I1128 13:37:36.722118 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-szjxp\" (UniqueName: \"kubernetes.io/projected/fbd150ac-9406-4874-beb0-1de8b4501c1b-kube-api-access-szjxp\") pod \"keystone-db-sync-d7h7j\" (UID: \"fbd150ac-9406-4874-beb0-1de8b4501c1b\") " pod="keystone-kuttl-tests/keystone-db-sync-d7h7j" Nov 28 13:37:36 crc kubenswrapper[4970]: I1128 13:37:36.722176 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fbd150ac-9406-4874-beb0-1de8b4501c1b-config-data\") pod \"keystone-db-sync-d7h7j\" (UID: \"fbd150ac-9406-4874-beb0-1de8b4501c1b\") " pod="keystone-kuttl-tests/keystone-db-sync-d7h7j" Nov 28 13:37:36 crc kubenswrapper[4970]: I1128 13:37:36.823883 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-szjxp\" (UniqueName: \"kubernetes.io/projected/fbd150ac-9406-4874-beb0-1de8b4501c1b-kube-api-access-szjxp\") pod \"keystone-db-sync-d7h7j\" (UID: \"fbd150ac-9406-4874-beb0-1de8b4501c1b\") " pod="keystone-kuttl-tests/keystone-db-sync-d7h7j" Nov 28 13:37:36 crc kubenswrapper[4970]: I1128 13:37:36.823983 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fbd150ac-9406-4874-beb0-1de8b4501c1b-config-data\") pod \"keystone-db-sync-d7h7j\" (UID: \"fbd150ac-9406-4874-beb0-1de8b4501c1b\") " pod="keystone-kuttl-tests/keystone-db-sync-d7h7j" Nov 28 13:37:36 crc kubenswrapper[4970]: I1128 13:37:36.842668 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fbd150ac-9406-4874-beb0-1de8b4501c1b-config-data\") pod \"keystone-db-sync-d7h7j\" (UID: \"fbd150ac-9406-4874-beb0-1de8b4501c1b\") " pod="keystone-kuttl-tests/keystone-db-sync-d7h7j" Nov 28 13:37:36 crc kubenswrapper[4970]: I1128 13:37:36.857718 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-szjxp\" (UniqueName: \"kubernetes.io/projected/fbd150ac-9406-4874-beb0-1de8b4501c1b-kube-api-access-szjxp\") pod \"keystone-db-sync-d7h7j\" (UID: \"fbd150ac-9406-4874-beb0-1de8b4501c1b\") " pod="keystone-kuttl-tests/keystone-db-sync-d7h7j" Nov 28 13:37:36 crc kubenswrapper[4970]: I1128 13:37:36.927601 4970 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="keystone-kuttl-tests/keystone-db-sync-d7h7j" Nov 28 13:37:37 crc kubenswrapper[4970]: I1128 13:37:37.252017 4970 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/keystone-db-sync-d7h7j"] Nov 28 13:37:38 crc kubenswrapper[4970]: I1128 13:37:38.195790 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-db-sync-d7h7j" event={"ID":"fbd150ac-9406-4874-beb0-1de8b4501c1b","Type":"ContainerStarted","Data":"1ac927e454374093b3f0d8e849e7f1cb74e24e63da4ecb000b30668ead8997d4"} Nov 28 13:37:38 crc kubenswrapper[4970]: I1128 13:37:38.196081 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-db-sync-d7h7j" event={"ID":"fbd150ac-9406-4874-beb0-1de8b4501c1b","Type":"ContainerStarted","Data":"1cdf06f24d0853f3e7529864b2d23a9dbd851ff32277d80b813ef97fb456b92d"} Nov 28 13:37:38 crc kubenswrapper[4970]: I1128 13:37:38.224712 4970 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="keystone-kuttl-tests/keystone-db-sync-d7h7j" podStartSLOduration=2.224689865 podStartE2EDuration="2.224689865s" podCreationTimestamp="2025-11-28 13:37:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-28 13:37:38.21959875 +0000 UTC m=+1069.072480550" watchObservedRunningTime="2025-11-28 13:37:38.224689865 +0000 UTC m=+1069.077571685" Nov 28 13:37:39 crc kubenswrapper[4970]: I1128 13:37:39.203099 4970 generic.go:334] "Generic (PLEG): container finished" podID="fbd150ac-9406-4874-beb0-1de8b4501c1b" containerID="1ac927e454374093b3f0d8e849e7f1cb74e24e63da4ecb000b30668ead8997d4" exitCode=0 Nov 28 13:37:39 crc kubenswrapper[4970]: I1128 13:37:39.203295 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-db-sync-d7h7j" event={"ID":"fbd150ac-9406-4874-beb0-1de8b4501c1b","Type":"ContainerDied","Data":"1ac927e454374093b3f0d8e849e7f1cb74e24e63da4ecb000b30668ead8997d4"} Nov 28 13:37:40 crc kubenswrapper[4970]: I1128 13:37:40.554033 4970 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone-db-sync-d7h7j" Nov 28 13:37:40 crc kubenswrapper[4970]: I1128 13:37:40.684861 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fbd150ac-9406-4874-beb0-1de8b4501c1b-config-data\") pod \"fbd150ac-9406-4874-beb0-1de8b4501c1b\" (UID: \"fbd150ac-9406-4874-beb0-1de8b4501c1b\") " Nov 28 13:37:40 crc kubenswrapper[4970]: I1128 13:37:40.684975 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-szjxp\" (UniqueName: \"kubernetes.io/projected/fbd150ac-9406-4874-beb0-1de8b4501c1b-kube-api-access-szjxp\") pod \"fbd150ac-9406-4874-beb0-1de8b4501c1b\" (UID: \"fbd150ac-9406-4874-beb0-1de8b4501c1b\") " Nov 28 13:37:40 crc kubenswrapper[4970]: I1128 13:37:40.703751 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fbd150ac-9406-4874-beb0-1de8b4501c1b-kube-api-access-szjxp" (OuterVolumeSpecName: "kube-api-access-szjxp") pod "fbd150ac-9406-4874-beb0-1de8b4501c1b" (UID: "fbd150ac-9406-4874-beb0-1de8b4501c1b"). InnerVolumeSpecName "kube-api-access-szjxp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 28 13:37:40 crc kubenswrapper[4970]: I1128 13:37:40.732983 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fbd150ac-9406-4874-beb0-1de8b4501c1b-config-data" (OuterVolumeSpecName: "config-data") pod "fbd150ac-9406-4874-beb0-1de8b4501c1b" (UID: "fbd150ac-9406-4874-beb0-1de8b4501c1b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 28 13:37:40 crc kubenswrapper[4970]: I1128 13:37:40.788128 4970 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fbd150ac-9406-4874-beb0-1de8b4501c1b-config-data\") on node \"crc\" DevicePath \"\"" Nov 28 13:37:40 crc kubenswrapper[4970]: I1128 13:37:40.788192 4970 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-szjxp\" (UniqueName: \"kubernetes.io/projected/fbd150ac-9406-4874-beb0-1de8b4501c1b-kube-api-access-szjxp\") on node \"crc\" DevicePath \"\"" Nov 28 13:37:41 crc kubenswrapper[4970]: I1128 13:37:41.220019 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-db-sync-d7h7j" event={"ID":"fbd150ac-9406-4874-beb0-1de8b4501c1b","Type":"ContainerDied","Data":"1cdf06f24d0853f3e7529864b2d23a9dbd851ff32277d80b813ef97fb456b92d"} Nov 28 13:37:41 crc kubenswrapper[4970]: I1128 13:37:41.220061 4970 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1cdf06f24d0853f3e7529864b2d23a9dbd851ff32277d80b813ef97fb456b92d" Nov 28 13:37:41 crc kubenswrapper[4970]: I1128 13:37:41.220130 4970 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone-db-sync-d7h7j" Nov 28 13:37:41 crc kubenswrapper[4970]: I1128 13:37:41.426927 4970 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["keystone-kuttl-tests/keystone-bootstrap-z6qgl"] Nov 28 13:37:41 crc kubenswrapper[4970]: E1128 13:37:41.427604 4970 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fbd150ac-9406-4874-beb0-1de8b4501c1b" containerName="keystone-db-sync" Nov 28 13:37:41 crc kubenswrapper[4970]: I1128 13:37:41.427625 4970 state_mem.go:107] "Deleted CPUSet assignment" podUID="fbd150ac-9406-4874-beb0-1de8b4501c1b" containerName="keystone-db-sync" Nov 28 13:37:41 crc kubenswrapper[4970]: I1128 13:37:41.427781 4970 memory_manager.go:354] "RemoveStaleState removing state" podUID="fbd150ac-9406-4874-beb0-1de8b4501c1b" containerName="keystone-db-sync" Nov 28 13:37:41 crc kubenswrapper[4970]: I1128 13:37:41.428374 4970 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="keystone-kuttl-tests/keystone-bootstrap-z6qgl" Nov 28 13:37:41 crc kubenswrapper[4970]: I1128 13:37:41.433370 4970 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"osp-secret" Nov 28 13:37:41 crc kubenswrapper[4970]: I1128 13:37:41.433616 4970 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"keystone" Nov 28 13:37:41 crc kubenswrapper[4970]: I1128 13:37:41.433634 4970 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"keystone-scripts" Nov 28 13:37:41 crc kubenswrapper[4970]: I1128 13:37:41.433669 4970 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"keystone-config-data" Nov 28 13:37:41 crc kubenswrapper[4970]: I1128 13:37:41.433809 4970 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"keystone-keystone-dockercfg-ztgrb" Nov 28 13:37:41 crc kubenswrapper[4970]: I1128 13:37:41.446635 4970 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/keystone-bootstrap-z6qgl"] Nov 28 13:37:41 crc kubenswrapper[4970]: I1128 13:37:41.497321 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/8dce7ed1-5bf7-4d47-b9d5-00a9b4cfb822-credential-keys\") pod \"keystone-bootstrap-z6qgl\" (UID: \"8dce7ed1-5bf7-4d47-b9d5-00a9b4cfb822\") " pod="keystone-kuttl-tests/keystone-bootstrap-z6qgl" Nov 28 13:37:41 crc kubenswrapper[4970]: I1128 13:37:41.497368 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qnfv4\" (UniqueName: \"kubernetes.io/projected/8dce7ed1-5bf7-4d47-b9d5-00a9b4cfb822-kube-api-access-qnfv4\") pod \"keystone-bootstrap-z6qgl\" (UID: \"8dce7ed1-5bf7-4d47-b9d5-00a9b4cfb822\") " pod="keystone-kuttl-tests/keystone-bootstrap-z6qgl" Nov 28 13:37:41 crc kubenswrapper[4970]: I1128 13:37:41.497391 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8dce7ed1-5bf7-4d47-b9d5-00a9b4cfb822-scripts\") pod \"keystone-bootstrap-z6qgl\" (UID: \"8dce7ed1-5bf7-4d47-b9d5-00a9b4cfb822\") " pod="keystone-kuttl-tests/keystone-bootstrap-z6qgl" Nov 28 13:37:41 crc kubenswrapper[4970]: I1128 13:37:41.497487 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/8dce7ed1-5bf7-4d47-b9d5-00a9b4cfb822-fernet-keys\") pod \"keystone-bootstrap-z6qgl\" (UID: \"8dce7ed1-5bf7-4d47-b9d5-00a9b4cfb822\") " pod="keystone-kuttl-tests/keystone-bootstrap-z6qgl" Nov 28 13:37:41 crc kubenswrapper[4970]: I1128 13:37:41.497769 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8dce7ed1-5bf7-4d47-b9d5-00a9b4cfb822-config-data\") pod \"keystone-bootstrap-z6qgl\" (UID: \"8dce7ed1-5bf7-4d47-b9d5-00a9b4cfb822\") " pod="keystone-kuttl-tests/keystone-bootstrap-z6qgl" Nov 28 13:37:41 crc kubenswrapper[4970]: I1128 13:37:41.598903 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8dce7ed1-5bf7-4d47-b9d5-00a9b4cfb822-config-data\") pod \"keystone-bootstrap-z6qgl\" (UID: \"8dce7ed1-5bf7-4d47-b9d5-00a9b4cfb822\") " pod="keystone-kuttl-tests/keystone-bootstrap-z6qgl" Nov 28 13:37:41 crc 
kubenswrapper[4970]: I1128 13:37:41.599024 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/8dce7ed1-5bf7-4d47-b9d5-00a9b4cfb822-credential-keys\") pod \"keystone-bootstrap-z6qgl\" (UID: \"8dce7ed1-5bf7-4d47-b9d5-00a9b4cfb822\") " pod="keystone-kuttl-tests/keystone-bootstrap-z6qgl" Nov 28 13:37:41 crc kubenswrapper[4970]: I1128 13:37:41.599072 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qnfv4\" (UniqueName: \"kubernetes.io/projected/8dce7ed1-5bf7-4d47-b9d5-00a9b4cfb822-kube-api-access-qnfv4\") pod \"keystone-bootstrap-z6qgl\" (UID: \"8dce7ed1-5bf7-4d47-b9d5-00a9b4cfb822\") " pod="keystone-kuttl-tests/keystone-bootstrap-z6qgl" Nov 28 13:37:41 crc kubenswrapper[4970]: I1128 13:37:41.599116 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8dce7ed1-5bf7-4d47-b9d5-00a9b4cfb822-scripts\") pod \"keystone-bootstrap-z6qgl\" (UID: \"8dce7ed1-5bf7-4d47-b9d5-00a9b4cfb822\") " pod="keystone-kuttl-tests/keystone-bootstrap-z6qgl" Nov 28 13:37:41 crc kubenswrapper[4970]: I1128 13:37:41.599188 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/8dce7ed1-5bf7-4d47-b9d5-00a9b4cfb822-fernet-keys\") pod \"keystone-bootstrap-z6qgl\" (UID: \"8dce7ed1-5bf7-4d47-b9d5-00a9b4cfb822\") " pod="keystone-kuttl-tests/keystone-bootstrap-z6qgl" Nov 28 13:37:41 crc kubenswrapper[4970]: I1128 13:37:41.604726 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8dce7ed1-5bf7-4d47-b9d5-00a9b4cfb822-config-data\") pod \"keystone-bootstrap-z6qgl\" (UID: \"8dce7ed1-5bf7-4d47-b9d5-00a9b4cfb822\") " pod="keystone-kuttl-tests/keystone-bootstrap-z6qgl" Nov 28 13:37:41 crc kubenswrapper[4970]: I1128 13:37:41.604948 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/8dce7ed1-5bf7-4d47-b9d5-00a9b4cfb822-credential-keys\") pod \"keystone-bootstrap-z6qgl\" (UID: \"8dce7ed1-5bf7-4d47-b9d5-00a9b4cfb822\") " pod="keystone-kuttl-tests/keystone-bootstrap-z6qgl" Nov 28 13:37:41 crc kubenswrapper[4970]: I1128 13:37:41.606837 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8dce7ed1-5bf7-4d47-b9d5-00a9b4cfb822-scripts\") pod \"keystone-bootstrap-z6qgl\" (UID: \"8dce7ed1-5bf7-4d47-b9d5-00a9b4cfb822\") " pod="keystone-kuttl-tests/keystone-bootstrap-z6qgl" Nov 28 13:37:41 crc kubenswrapper[4970]: I1128 13:37:41.608849 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/8dce7ed1-5bf7-4d47-b9d5-00a9b4cfb822-fernet-keys\") pod \"keystone-bootstrap-z6qgl\" (UID: \"8dce7ed1-5bf7-4d47-b9d5-00a9b4cfb822\") " pod="keystone-kuttl-tests/keystone-bootstrap-z6qgl" Nov 28 13:37:41 crc kubenswrapper[4970]: I1128 13:37:41.613856 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qnfv4\" (UniqueName: \"kubernetes.io/projected/8dce7ed1-5bf7-4d47-b9d5-00a9b4cfb822-kube-api-access-qnfv4\") pod \"keystone-bootstrap-z6qgl\" (UID: \"8dce7ed1-5bf7-4d47-b9d5-00a9b4cfb822\") " pod="keystone-kuttl-tests/keystone-bootstrap-z6qgl" Nov 28 13:37:41 crc kubenswrapper[4970]: I1128 13:37:41.748273 4970 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="keystone-kuttl-tests/keystone-bootstrap-z6qgl" Nov 28 13:37:42 crc kubenswrapper[4970]: I1128 13:37:42.164366 4970 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/keystone-bootstrap-z6qgl"] Nov 28 13:37:42 crc kubenswrapper[4970]: W1128 13:37:42.171373 4970 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8dce7ed1_5bf7_4d47_b9d5_00a9b4cfb822.slice/crio-de1f5508cd9ae6268e813afb8014a43db8db733b176d97536dc3b47cc1ba74f2 WatchSource:0}: Error finding container de1f5508cd9ae6268e813afb8014a43db8db733b176d97536dc3b47cc1ba74f2: Status 404 returned error can't find the container with id de1f5508cd9ae6268e813afb8014a43db8db733b176d97536dc3b47cc1ba74f2 Nov 28 13:37:42 crc kubenswrapper[4970]: I1128 13:37:42.225911 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-bootstrap-z6qgl" event={"ID":"8dce7ed1-5bf7-4d47-b9d5-00a9b4cfb822","Type":"ContainerStarted","Data":"de1f5508cd9ae6268e813afb8014a43db8db733b176d97536dc3b47cc1ba74f2"} Nov 28 13:37:43 crc kubenswrapper[4970]: I1128 13:37:43.236343 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-bootstrap-z6qgl" event={"ID":"8dce7ed1-5bf7-4d47-b9d5-00a9b4cfb822","Type":"ContainerStarted","Data":"646cbfa6b7429c72516aecbb90b650d3c46225a5c6b68aa8d3e4f3e7653dc6c9"} Nov 28 13:37:43 crc kubenswrapper[4970]: I1128 13:37:43.266577 4970 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="keystone-kuttl-tests/keystone-bootstrap-z6qgl" podStartSLOduration=2.266556401 podStartE2EDuration="2.266556401s" podCreationTimestamp="2025-11-28 13:37:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-28 13:37:43.259462393 +0000 UTC m=+1074.112344233" watchObservedRunningTime="2025-11-28 13:37:43.266556401 +0000 UTC m=+1074.119438221" Nov 28 13:37:45 crc kubenswrapper[4970]: I1128 13:37:45.250385 4970 generic.go:334] "Generic (PLEG): container finished" podID="8dce7ed1-5bf7-4d47-b9d5-00a9b4cfb822" containerID="646cbfa6b7429c72516aecbb90b650d3c46225a5c6b68aa8d3e4f3e7653dc6c9" exitCode=0 Nov 28 13:37:45 crc kubenswrapper[4970]: I1128 13:37:45.250631 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-bootstrap-z6qgl" event={"ID":"8dce7ed1-5bf7-4d47-b9d5-00a9b4cfb822","Type":"ContainerDied","Data":"646cbfa6b7429c72516aecbb90b650d3c46225a5c6b68aa8d3e4f3e7653dc6c9"} Nov 28 13:37:46 crc kubenswrapper[4970]: I1128 13:37:46.646294 4970 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="keystone-kuttl-tests/keystone-bootstrap-z6qgl" Nov 28 13:37:46 crc kubenswrapper[4970]: I1128 13:37:46.773913 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8dce7ed1-5bf7-4d47-b9d5-00a9b4cfb822-scripts\") pod \"8dce7ed1-5bf7-4d47-b9d5-00a9b4cfb822\" (UID: \"8dce7ed1-5bf7-4d47-b9d5-00a9b4cfb822\") " Nov 28 13:37:46 crc kubenswrapper[4970]: I1128 13:37:46.773992 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8dce7ed1-5bf7-4d47-b9d5-00a9b4cfb822-config-data\") pod \"8dce7ed1-5bf7-4d47-b9d5-00a9b4cfb822\" (UID: \"8dce7ed1-5bf7-4d47-b9d5-00a9b4cfb822\") " Nov 28 13:37:46 crc kubenswrapper[4970]: I1128 13:37:46.774055 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/8dce7ed1-5bf7-4d47-b9d5-00a9b4cfb822-credential-keys\") pod \"8dce7ed1-5bf7-4d47-b9d5-00a9b4cfb822\" (UID: \"8dce7ed1-5bf7-4d47-b9d5-00a9b4cfb822\") " Nov 28 13:37:46 crc kubenswrapper[4970]: I1128 13:37:46.774095 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/8dce7ed1-5bf7-4d47-b9d5-00a9b4cfb822-fernet-keys\") pod \"8dce7ed1-5bf7-4d47-b9d5-00a9b4cfb822\" (UID: \"8dce7ed1-5bf7-4d47-b9d5-00a9b4cfb822\") " Nov 28 13:37:46 crc kubenswrapper[4970]: I1128 13:37:46.774116 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qnfv4\" (UniqueName: \"kubernetes.io/projected/8dce7ed1-5bf7-4d47-b9d5-00a9b4cfb822-kube-api-access-qnfv4\") pod \"8dce7ed1-5bf7-4d47-b9d5-00a9b4cfb822\" (UID: \"8dce7ed1-5bf7-4d47-b9d5-00a9b4cfb822\") " Nov 28 13:37:46 crc kubenswrapper[4970]: I1128 13:37:46.783146 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8dce7ed1-5bf7-4d47-b9d5-00a9b4cfb822-kube-api-access-qnfv4" (OuterVolumeSpecName: "kube-api-access-qnfv4") pod "8dce7ed1-5bf7-4d47-b9d5-00a9b4cfb822" (UID: "8dce7ed1-5bf7-4d47-b9d5-00a9b4cfb822"). InnerVolumeSpecName "kube-api-access-qnfv4". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 28 13:37:46 crc kubenswrapper[4970]: I1128 13:37:46.783257 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8dce7ed1-5bf7-4d47-b9d5-00a9b4cfb822-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "8dce7ed1-5bf7-4d47-b9d5-00a9b4cfb822" (UID: "8dce7ed1-5bf7-4d47-b9d5-00a9b4cfb822"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 28 13:37:46 crc kubenswrapper[4970]: I1128 13:37:46.784754 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8dce7ed1-5bf7-4d47-b9d5-00a9b4cfb822-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "8dce7ed1-5bf7-4d47-b9d5-00a9b4cfb822" (UID: "8dce7ed1-5bf7-4d47-b9d5-00a9b4cfb822"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 28 13:37:46 crc kubenswrapper[4970]: I1128 13:37:46.786076 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8dce7ed1-5bf7-4d47-b9d5-00a9b4cfb822-scripts" (OuterVolumeSpecName: "scripts") pod "8dce7ed1-5bf7-4d47-b9d5-00a9b4cfb822" (UID: "8dce7ed1-5bf7-4d47-b9d5-00a9b4cfb822"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 28 13:37:46 crc kubenswrapper[4970]: I1128 13:37:46.799869 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8dce7ed1-5bf7-4d47-b9d5-00a9b4cfb822-config-data" (OuterVolumeSpecName: "config-data") pod "8dce7ed1-5bf7-4d47-b9d5-00a9b4cfb822" (UID: "8dce7ed1-5bf7-4d47-b9d5-00a9b4cfb822"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 28 13:37:46 crc kubenswrapper[4970]: I1128 13:37:46.876162 4970 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8dce7ed1-5bf7-4d47-b9d5-00a9b4cfb822-config-data\") on node \"crc\" DevicePath \"\"" Nov 28 13:37:46 crc kubenswrapper[4970]: I1128 13:37:46.876243 4970 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/8dce7ed1-5bf7-4d47-b9d5-00a9b4cfb822-credential-keys\") on node \"crc\" DevicePath \"\"" Nov 28 13:37:46 crc kubenswrapper[4970]: I1128 13:37:46.876292 4970 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/8dce7ed1-5bf7-4d47-b9d5-00a9b4cfb822-fernet-keys\") on node \"crc\" DevicePath \"\"" Nov 28 13:37:46 crc kubenswrapper[4970]: I1128 13:37:46.876321 4970 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qnfv4\" (UniqueName: \"kubernetes.io/projected/8dce7ed1-5bf7-4d47-b9d5-00a9b4cfb822-kube-api-access-qnfv4\") on node \"crc\" DevicePath \"\"" Nov 28 13:37:46 crc kubenswrapper[4970]: I1128 13:37:46.876345 4970 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8dce7ed1-5bf7-4d47-b9d5-00a9b4cfb822-scripts\") on node \"crc\" DevicePath \"\"" Nov 28 13:37:47 crc kubenswrapper[4970]: I1128 13:37:47.272306 4970 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone-bootstrap-z6qgl" Nov 28 13:37:47 crc kubenswrapper[4970]: I1128 13:37:47.273614 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-bootstrap-z6qgl" event={"ID":"8dce7ed1-5bf7-4d47-b9d5-00a9b4cfb822","Type":"ContainerDied","Data":"de1f5508cd9ae6268e813afb8014a43db8db733b176d97536dc3b47cc1ba74f2"} Nov 28 13:37:47 crc kubenswrapper[4970]: I1128 13:37:47.273689 4970 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="de1f5508cd9ae6268e813afb8014a43db8db733b176d97536dc3b47cc1ba74f2" Nov 28 13:37:47 crc kubenswrapper[4970]: I1128 13:37:47.767691 4970 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["keystone-kuttl-tests/keystone-7db468b797-84k64"] Nov 28 13:37:47 crc kubenswrapper[4970]: E1128 13:37:47.768319 4970 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8dce7ed1-5bf7-4d47-b9d5-00a9b4cfb822" containerName="keystone-bootstrap" Nov 28 13:37:47 crc kubenswrapper[4970]: I1128 13:37:47.768338 4970 state_mem.go:107] "Deleted CPUSet assignment" podUID="8dce7ed1-5bf7-4d47-b9d5-00a9b4cfb822" containerName="keystone-bootstrap" Nov 28 13:37:47 crc kubenswrapper[4970]: I1128 13:37:47.768550 4970 memory_manager.go:354] "RemoveStaleState removing state" podUID="8dce7ed1-5bf7-4d47-b9d5-00a9b4cfb822" containerName="keystone-bootstrap" Nov 28 13:37:47 crc kubenswrapper[4970]: I1128 13:37:47.769266 4970 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="keystone-kuttl-tests/keystone-7db468b797-84k64" Nov 28 13:37:47 crc kubenswrapper[4970]: I1128 13:37:47.775037 4970 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"keystone-config-data" Nov 28 13:37:47 crc kubenswrapper[4970]: I1128 13:37:47.775290 4970 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"keystone-keystone-dockercfg-ztgrb" Nov 28 13:37:47 crc kubenswrapper[4970]: I1128 13:37:47.776167 4970 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"keystone-scripts" Nov 28 13:37:47 crc kubenswrapper[4970]: I1128 13:37:47.776366 4970 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"keystone" Nov 28 13:37:47 crc kubenswrapper[4970]: I1128 13:37:47.778006 4970 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/keystone-7db468b797-84k64"] Nov 28 13:37:47 crc kubenswrapper[4970]: I1128 13:37:47.892564 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/0f05ed6a-8d6d-40db-979a-f2d8dec89e36-fernet-keys\") pod \"keystone-7db468b797-84k64\" (UID: \"0f05ed6a-8d6d-40db-979a-f2d8dec89e36\") " pod="keystone-kuttl-tests/keystone-7db468b797-84k64" Nov 28 13:37:47 crc kubenswrapper[4970]: I1128 13:37:47.892614 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pmtlp\" (UniqueName: \"kubernetes.io/projected/0f05ed6a-8d6d-40db-979a-f2d8dec89e36-kube-api-access-pmtlp\") pod \"keystone-7db468b797-84k64\" (UID: \"0f05ed6a-8d6d-40db-979a-f2d8dec89e36\") " pod="keystone-kuttl-tests/keystone-7db468b797-84k64" Nov 28 13:37:47 crc kubenswrapper[4970]: I1128 13:37:47.892670 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0f05ed6a-8d6d-40db-979a-f2d8dec89e36-scripts\") pod \"keystone-7db468b797-84k64\" (UID: \"0f05ed6a-8d6d-40db-979a-f2d8dec89e36\") " pod="keystone-kuttl-tests/keystone-7db468b797-84k64" Nov 28 13:37:47 crc kubenswrapper[4970]: I1128 13:37:47.892751 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/0f05ed6a-8d6d-40db-979a-f2d8dec89e36-credential-keys\") pod \"keystone-7db468b797-84k64\" (UID: \"0f05ed6a-8d6d-40db-979a-f2d8dec89e36\") " pod="keystone-kuttl-tests/keystone-7db468b797-84k64" Nov 28 13:37:47 crc kubenswrapper[4970]: I1128 13:37:47.892778 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0f05ed6a-8d6d-40db-979a-f2d8dec89e36-config-data\") pod \"keystone-7db468b797-84k64\" (UID: \"0f05ed6a-8d6d-40db-979a-f2d8dec89e36\") " pod="keystone-kuttl-tests/keystone-7db468b797-84k64" Nov 28 13:37:47 crc kubenswrapper[4970]: I1128 13:37:47.994337 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/0f05ed6a-8d6d-40db-979a-f2d8dec89e36-credential-keys\") pod \"keystone-7db468b797-84k64\" (UID: \"0f05ed6a-8d6d-40db-979a-f2d8dec89e36\") " pod="keystone-kuttl-tests/keystone-7db468b797-84k64" Nov 28 13:37:47 crc kubenswrapper[4970]: I1128 13:37:47.994425 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/0f05ed6a-8d6d-40db-979a-f2d8dec89e36-config-data\") pod \"keystone-7db468b797-84k64\" (UID: \"0f05ed6a-8d6d-40db-979a-f2d8dec89e36\") " pod="keystone-kuttl-tests/keystone-7db468b797-84k64" Nov 28 13:37:47 crc kubenswrapper[4970]: I1128 13:37:47.994543 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/0f05ed6a-8d6d-40db-979a-f2d8dec89e36-fernet-keys\") pod \"keystone-7db468b797-84k64\" (UID: \"0f05ed6a-8d6d-40db-979a-f2d8dec89e36\") " pod="keystone-kuttl-tests/keystone-7db468b797-84k64" Nov 28 13:37:47 crc kubenswrapper[4970]: I1128 13:37:47.994579 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pmtlp\" (UniqueName: \"kubernetes.io/projected/0f05ed6a-8d6d-40db-979a-f2d8dec89e36-kube-api-access-pmtlp\") pod \"keystone-7db468b797-84k64\" (UID: \"0f05ed6a-8d6d-40db-979a-f2d8dec89e36\") " pod="keystone-kuttl-tests/keystone-7db468b797-84k64" Nov 28 13:37:47 crc kubenswrapper[4970]: I1128 13:37:47.994622 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0f05ed6a-8d6d-40db-979a-f2d8dec89e36-scripts\") pod \"keystone-7db468b797-84k64\" (UID: \"0f05ed6a-8d6d-40db-979a-f2d8dec89e36\") " pod="keystone-kuttl-tests/keystone-7db468b797-84k64" Nov 28 13:37:47 crc kubenswrapper[4970]: I1128 13:37:47.999194 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0f05ed6a-8d6d-40db-979a-f2d8dec89e36-scripts\") pod \"keystone-7db468b797-84k64\" (UID: \"0f05ed6a-8d6d-40db-979a-f2d8dec89e36\") " pod="keystone-kuttl-tests/keystone-7db468b797-84k64" Nov 28 13:37:47 crc kubenswrapper[4970]: I1128 13:37:47.999479 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/0f05ed6a-8d6d-40db-979a-f2d8dec89e36-fernet-keys\") pod \"keystone-7db468b797-84k64\" (UID: \"0f05ed6a-8d6d-40db-979a-f2d8dec89e36\") " pod="keystone-kuttl-tests/keystone-7db468b797-84k64" Nov 28 13:37:48 crc kubenswrapper[4970]: I1128 13:37:47.999976 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0f05ed6a-8d6d-40db-979a-f2d8dec89e36-config-data\") pod \"keystone-7db468b797-84k64\" (UID: \"0f05ed6a-8d6d-40db-979a-f2d8dec89e36\") " pod="keystone-kuttl-tests/keystone-7db468b797-84k64" Nov 28 13:37:48 crc kubenswrapper[4970]: I1128 13:37:48.000642 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/0f05ed6a-8d6d-40db-979a-f2d8dec89e36-credential-keys\") pod \"keystone-7db468b797-84k64\" (UID: \"0f05ed6a-8d6d-40db-979a-f2d8dec89e36\") " pod="keystone-kuttl-tests/keystone-7db468b797-84k64" Nov 28 13:37:48 crc kubenswrapper[4970]: I1128 13:37:48.017271 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pmtlp\" (UniqueName: \"kubernetes.io/projected/0f05ed6a-8d6d-40db-979a-f2d8dec89e36-kube-api-access-pmtlp\") pod \"keystone-7db468b797-84k64\" (UID: \"0f05ed6a-8d6d-40db-979a-f2d8dec89e36\") " pod="keystone-kuttl-tests/keystone-7db468b797-84k64" Nov 28 13:37:48 crc kubenswrapper[4970]: I1128 13:37:48.085974 4970 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="keystone-kuttl-tests/keystone-7db468b797-84k64" Nov 28 13:37:48 crc kubenswrapper[4970]: I1128 13:37:48.513432 4970 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/keystone-7db468b797-84k64"] Nov 28 13:37:49 crc kubenswrapper[4970]: I1128 13:37:49.297676 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-7db468b797-84k64" event={"ID":"0f05ed6a-8d6d-40db-979a-f2d8dec89e36","Type":"ContainerStarted","Data":"1551c31bbebde1b59002fd20730e7cc8255df9c3d077273806293f371f9e6129"} Nov 28 13:37:49 crc kubenswrapper[4970]: I1128 13:37:49.298133 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-7db468b797-84k64" event={"ID":"0f05ed6a-8d6d-40db-979a-f2d8dec89e36","Type":"ContainerStarted","Data":"c30369795ed6b22a46437cc76aa448e00a92e88d297c8cef95ee86e7029ee27d"} Nov 28 13:37:49 crc kubenswrapper[4970]: I1128 13:37:49.298329 4970 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="keystone-kuttl-tests/keystone-7db468b797-84k64" Nov 28 13:37:49 crc kubenswrapper[4970]: I1128 13:37:49.331069 4970 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="keystone-kuttl-tests/keystone-7db468b797-84k64" podStartSLOduration=2.331042588 podStartE2EDuration="2.331042588s" podCreationTimestamp="2025-11-28 13:37:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-28 13:37:49.321965147 +0000 UTC m=+1080.174846987" watchObservedRunningTime="2025-11-28 13:37:49.331042588 +0000 UTC m=+1080.183924438" Nov 28 13:37:51 crc kubenswrapper[4970]: I1128 13:37:51.333404 4970 patch_prober.go:28] interesting pod/machine-config-daemon-tjrng container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 28 13:37:51 crc kubenswrapper[4970]: I1128 13:37:51.333883 4970 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-tjrng" podUID="70bedd43-c527-436e-b47b-0b9ec5b10601" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 28 13:38:19 crc kubenswrapper[4970]: I1128 13:38:19.533484 4970 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="keystone-kuttl-tests/keystone-7db468b797-84k64" Nov 28 13:38:20 crc kubenswrapper[4970]: I1128 13:38:20.883996 4970 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["keystone-kuttl-tests/keystone-7db468b797-cr7f6"] Nov 28 13:38:20 crc kubenswrapper[4970]: I1128 13:38:20.885846 4970 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone-7db468b797-cr7f6" Nov 28 13:38:20 crc kubenswrapper[4970]: I1128 13:38:20.894746 4970 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/keystone-7db468b797-cr7f6"] Nov 28 13:38:20 crc kubenswrapper[4970]: I1128 13:38:20.900209 4970 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["keystone-kuttl-tests/keystone-7db468b797-t6vnf"] Nov 28 13:38:20 crc kubenswrapper[4970]: I1128 13:38:20.900979 4970 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="keystone-kuttl-tests/keystone-7db468b797-t6vnf" Nov 28 13:38:20 crc kubenswrapper[4970]: I1128 13:38:20.920567 4970 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/keystone-7db468b797-t6vnf"] Nov 28 13:38:20 crc kubenswrapper[4970]: I1128 13:38:20.940178 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/171b3015-7d7c-4526-98fc-099779444c59-config-data\") pod \"keystone-7db468b797-t6vnf\" (UID: \"171b3015-7d7c-4526-98fc-099779444c59\") " pod="keystone-kuttl-tests/keystone-7db468b797-t6vnf" Nov 28 13:38:20 crc kubenswrapper[4970]: I1128 13:38:20.940268 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/827ef730-9bfe-4fc4-9a97-720691751c69-scripts\") pod \"keystone-7db468b797-cr7f6\" (UID: \"827ef730-9bfe-4fc4-9a97-720691751c69\") " pod="keystone-kuttl-tests/keystone-7db468b797-cr7f6" Nov 28 13:38:20 crc kubenswrapper[4970]: I1128 13:38:20.940302 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/171b3015-7d7c-4526-98fc-099779444c59-credential-keys\") pod \"keystone-7db468b797-t6vnf\" (UID: \"171b3015-7d7c-4526-98fc-099779444c59\") " pod="keystone-kuttl-tests/keystone-7db468b797-t6vnf" Nov 28 13:38:20 crc kubenswrapper[4970]: I1128 13:38:20.940385 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/827ef730-9bfe-4fc4-9a97-720691751c69-config-data\") pod \"keystone-7db468b797-cr7f6\" (UID: \"827ef730-9bfe-4fc4-9a97-720691751c69\") " pod="keystone-kuttl-tests/keystone-7db468b797-cr7f6" Nov 28 13:38:20 crc kubenswrapper[4970]: I1128 13:38:20.940419 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/171b3015-7d7c-4526-98fc-099779444c59-scripts\") pod \"keystone-7db468b797-t6vnf\" (UID: \"171b3015-7d7c-4526-98fc-099779444c59\") " pod="keystone-kuttl-tests/keystone-7db468b797-t6vnf" Nov 28 13:38:20 crc kubenswrapper[4970]: I1128 13:38:20.940462 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7vgrc\" (UniqueName: \"kubernetes.io/projected/827ef730-9bfe-4fc4-9a97-720691751c69-kube-api-access-7vgrc\") pod \"keystone-7db468b797-cr7f6\" (UID: \"827ef730-9bfe-4fc4-9a97-720691751c69\") " pod="keystone-kuttl-tests/keystone-7db468b797-cr7f6" Nov 28 13:38:20 crc kubenswrapper[4970]: I1128 13:38:20.940485 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/827ef730-9bfe-4fc4-9a97-720691751c69-fernet-keys\") pod \"keystone-7db468b797-cr7f6\" (UID: \"827ef730-9bfe-4fc4-9a97-720691751c69\") " pod="keystone-kuttl-tests/keystone-7db468b797-cr7f6" Nov 28 13:38:20 crc kubenswrapper[4970]: I1128 13:38:20.940520 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/171b3015-7d7c-4526-98fc-099779444c59-fernet-keys\") pod \"keystone-7db468b797-t6vnf\" (UID: \"171b3015-7d7c-4526-98fc-099779444c59\") " pod="keystone-kuttl-tests/keystone-7db468b797-t6vnf" Nov 28 13:38:20 crc kubenswrapper[4970]: I1128 
13:38:20.940553 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6bxq5\" (UniqueName: \"kubernetes.io/projected/171b3015-7d7c-4526-98fc-099779444c59-kube-api-access-6bxq5\") pod \"keystone-7db468b797-t6vnf\" (UID: \"171b3015-7d7c-4526-98fc-099779444c59\") " pod="keystone-kuttl-tests/keystone-7db468b797-t6vnf" Nov 28 13:38:20 crc kubenswrapper[4970]: I1128 13:38:20.940604 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/827ef730-9bfe-4fc4-9a97-720691751c69-credential-keys\") pod \"keystone-7db468b797-cr7f6\" (UID: \"827ef730-9bfe-4fc4-9a97-720691751c69\") " pod="keystone-kuttl-tests/keystone-7db468b797-cr7f6" Nov 28 13:38:21 crc kubenswrapper[4970]: I1128 13:38:21.042037 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/171b3015-7d7c-4526-98fc-099779444c59-fernet-keys\") pod \"keystone-7db468b797-t6vnf\" (UID: \"171b3015-7d7c-4526-98fc-099779444c59\") " pod="keystone-kuttl-tests/keystone-7db468b797-t6vnf" Nov 28 13:38:21 crc kubenswrapper[4970]: I1128 13:38:21.042082 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6bxq5\" (UniqueName: \"kubernetes.io/projected/171b3015-7d7c-4526-98fc-099779444c59-kube-api-access-6bxq5\") pod \"keystone-7db468b797-t6vnf\" (UID: \"171b3015-7d7c-4526-98fc-099779444c59\") " pod="keystone-kuttl-tests/keystone-7db468b797-t6vnf" Nov 28 13:38:21 crc kubenswrapper[4970]: I1128 13:38:21.042112 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/827ef730-9bfe-4fc4-9a97-720691751c69-credential-keys\") pod \"keystone-7db468b797-cr7f6\" (UID: \"827ef730-9bfe-4fc4-9a97-720691751c69\") " pod="keystone-kuttl-tests/keystone-7db468b797-cr7f6" Nov 28 13:38:21 crc kubenswrapper[4970]: I1128 13:38:21.042170 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/171b3015-7d7c-4526-98fc-099779444c59-config-data\") pod \"keystone-7db468b797-t6vnf\" (UID: \"171b3015-7d7c-4526-98fc-099779444c59\") " pod="keystone-kuttl-tests/keystone-7db468b797-t6vnf" Nov 28 13:38:21 crc kubenswrapper[4970]: I1128 13:38:21.042259 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/827ef730-9bfe-4fc4-9a97-720691751c69-scripts\") pod \"keystone-7db468b797-cr7f6\" (UID: \"827ef730-9bfe-4fc4-9a97-720691751c69\") " pod="keystone-kuttl-tests/keystone-7db468b797-cr7f6" Nov 28 13:38:21 crc kubenswrapper[4970]: I1128 13:38:21.042281 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/171b3015-7d7c-4526-98fc-099779444c59-credential-keys\") pod \"keystone-7db468b797-t6vnf\" (UID: \"171b3015-7d7c-4526-98fc-099779444c59\") " pod="keystone-kuttl-tests/keystone-7db468b797-t6vnf" Nov 28 13:38:21 crc kubenswrapper[4970]: I1128 13:38:21.042298 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/827ef730-9bfe-4fc4-9a97-720691751c69-config-data\") pod \"keystone-7db468b797-cr7f6\" (UID: \"827ef730-9bfe-4fc4-9a97-720691751c69\") " pod="keystone-kuttl-tests/keystone-7db468b797-cr7f6" Nov 28 13:38:21 crc 
kubenswrapper[4970]: I1128 13:38:21.042315 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/171b3015-7d7c-4526-98fc-099779444c59-scripts\") pod \"keystone-7db468b797-t6vnf\" (UID: \"171b3015-7d7c-4526-98fc-099779444c59\") " pod="keystone-kuttl-tests/keystone-7db468b797-t6vnf" Nov 28 13:38:21 crc kubenswrapper[4970]: I1128 13:38:21.042338 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7vgrc\" (UniqueName: \"kubernetes.io/projected/827ef730-9bfe-4fc4-9a97-720691751c69-kube-api-access-7vgrc\") pod \"keystone-7db468b797-cr7f6\" (UID: \"827ef730-9bfe-4fc4-9a97-720691751c69\") " pod="keystone-kuttl-tests/keystone-7db468b797-cr7f6" Nov 28 13:38:21 crc kubenswrapper[4970]: I1128 13:38:21.042354 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/827ef730-9bfe-4fc4-9a97-720691751c69-fernet-keys\") pod \"keystone-7db468b797-cr7f6\" (UID: \"827ef730-9bfe-4fc4-9a97-720691751c69\") " pod="keystone-kuttl-tests/keystone-7db468b797-cr7f6" Nov 28 13:38:21 crc kubenswrapper[4970]: I1128 13:38:21.048459 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/171b3015-7d7c-4526-98fc-099779444c59-scripts\") pod \"keystone-7db468b797-t6vnf\" (UID: \"171b3015-7d7c-4526-98fc-099779444c59\") " pod="keystone-kuttl-tests/keystone-7db468b797-t6vnf" Nov 28 13:38:21 crc kubenswrapper[4970]: I1128 13:38:21.050442 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/827ef730-9bfe-4fc4-9a97-720691751c69-credential-keys\") pod \"keystone-7db468b797-cr7f6\" (UID: \"827ef730-9bfe-4fc4-9a97-720691751c69\") " pod="keystone-kuttl-tests/keystone-7db468b797-cr7f6" Nov 28 13:38:21 crc kubenswrapper[4970]: I1128 13:38:21.052794 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/171b3015-7d7c-4526-98fc-099779444c59-credential-keys\") pod \"keystone-7db468b797-t6vnf\" (UID: \"171b3015-7d7c-4526-98fc-099779444c59\") " pod="keystone-kuttl-tests/keystone-7db468b797-t6vnf" Nov 28 13:38:21 crc kubenswrapper[4970]: I1128 13:38:21.053090 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/171b3015-7d7c-4526-98fc-099779444c59-config-data\") pod \"keystone-7db468b797-t6vnf\" (UID: \"171b3015-7d7c-4526-98fc-099779444c59\") " pod="keystone-kuttl-tests/keystone-7db468b797-t6vnf" Nov 28 13:38:21 crc kubenswrapper[4970]: I1128 13:38:21.057258 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/827ef730-9bfe-4fc4-9a97-720691751c69-config-data\") pod \"keystone-7db468b797-cr7f6\" (UID: \"827ef730-9bfe-4fc4-9a97-720691751c69\") " pod="keystone-kuttl-tests/keystone-7db468b797-cr7f6" Nov 28 13:38:21 crc kubenswrapper[4970]: I1128 13:38:21.057424 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/171b3015-7d7c-4526-98fc-099779444c59-fernet-keys\") pod \"keystone-7db468b797-t6vnf\" (UID: \"171b3015-7d7c-4526-98fc-099779444c59\") " pod="keystone-kuttl-tests/keystone-7db468b797-t6vnf" Nov 28 13:38:21 crc kubenswrapper[4970]: I1128 13:38:21.063614 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"kube-api-access-6bxq5\" (UniqueName: \"kubernetes.io/projected/171b3015-7d7c-4526-98fc-099779444c59-kube-api-access-6bxq5\") pod \"keystone-7db468b797-t6vnf\" (UID: \"171b3015-7d7c-4526-98fc-099779444c59\") " pod="keystone-kuttl-tests/keystone-7db468b797-t6vnf" Nov 28 13:38:21 crc kubenswrapper[4970]: I1128 13:38:21.066782 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/827ef730-9bfe-4fc4-9a97-720691751c69-scripts\") pod \"keystone-7db468b797-cr7f6\" (UID: \"827ef730-9bfe-4fc4-9a97-720691751c69\") " pod="keystone-kuttl-tests/keystone-7db468b797-cr7f6" Nov 28 13:38:21 crc kubenswrapper[4970]: I1128 13:38:21.066873 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7vgrc\" (UniqueName: \"kubernetes.io/projected/827ef730-9bfe-4fc4-9a97-720691751c69-kube-api-access-7vgrc\") pod \"keystone-7db468b797-cr7f6\" (UID: \"827ef730-9bfe-4fc4-9a97-720691751c69\") " pod="keystone-kuttl-tests/keystone-7db468b797-cr7f6" Nov 28 13:38:21 crc kubenswrapper[4970]: I1128 13:38:21.068132 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/827ef730-9bfe-4fc4-9a97-720691751c69-fernet-keys\") pod \"keystone-7db468b797-cr7f6\" (UID: \"827ef730-9bfe-4fc4-9a97-720691751c69\") " pod="keystone-kuttl-tests/keystone-7db468b797-cr7f6" Nov 28 13:38:21 crc kubenswrapper[4970]: I1128 13:38:21.234729 4970 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone-7db468b797-cr7f6" Nov 28 13:38:21 crc kubenswrapper[4970]: I1128 13:38:21.247824 4970 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone-7db468b797-t6vnf" Nov 28 13:38:21 crc kubenswrapper[4970]: I1128 13:38:21.334488 4970 patch_prober.go:28] interesting pod/machine-config-daemon-tjrng container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 28 13:38:21 crc kubenswrapper[4970]: I1128 13:38:21.334597 4970 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-tjrng" podUID="70bedd43-c527-436e-b47b-0b9ec5b10601" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 28 13:38:21 crc kubenswrapper[4970]: I1128 13:38:21.563803 4970 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/keystone-7db468b797-cr7f6"] Nov 28 13:38:21 crc kubenswrapper[4970]: I1128 13:38:21.639095 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-7db468b797-cr7f6" event={"ID":"827ef730-9bfe-4fc4-9a97-720691751c69","Type":"ContainerStarted","Data":"a335c90583bdcbc13da2a74d958a83940ef7db2f1da21274a2621e0f578755d6"} Nov 28 13:38:21 crc kubenswrapper[4970]: I1128 13:38:21.714461 4970 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/keystone-7db468b797-t6vnf"] Nov 28 13:38:21 crc kubenswrapper[4970]: W1128 13:38:21.728444 4970 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod171b3015_7d7c_4526_98fc_099779444c59.slice/crio-529c6bdaf0392e57acabfa1901aba9d882f97b20c16b1cdf69c299949545951b WatchSource:0}: Error finding 
container 529c6bdaf0392e57acabfa1901aba9d882f97b20c16b1cdf69c299949545951b: Status 404 returned error can't find the container with id 529c6bdaf0392e57acabfa1901aba9d882f97b20c16b1cdf69c299949545951b Nov 28 13:38:22 crc kubenswrapper[4970]: I1128 13:38:22.647861 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-7db468b797-t6vnf" event={"ID":"171b3015-7d7c-4526-98fc-099779444c59","Type":"ContainerStarted","Data":"d0e22ed8f97c683bd8e48789ef5e3939417ec7db10d0f741149ae3544e7fc1ff"} Nov 28 13:38:22 crc kubenswrapper[4970]: I1128 13:38:22.648306 4970 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="keystone-kuttl-tests/keystone-7db468b797-t6vnf" Nov 28 13:38:22 crc kubenswrapper[4970]: I1128 13:38:22.648343 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-7db468b797-t6vnf" event={"ID":"171b3015-7d7c-4526-98fc-099779444c59","Type":"ContainerStarted","Data":"529c6bdaf0392e57acabfa1901aba9d882f97b20c16b1cdf69c299949545951b"} Nov 28 13:38:22 crc kubenswrapper[4970]: I1128 13:38:22.649294 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-7db468b797-cr7f6" event={"ID":"827ef730-9bfe-4fc4-9a97-720691751c69","Type":"ContainerStarted","Data":"8c34c9a279ed20316796976e583fc62a004af3b37e13613775beb3b29bb75c4f"} Nov 28 13:38:22 crc kubenswrapper[4970]: I1128 13:38:22.649656 4970 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="keystone-kuttl-tests/keystone-7db468b797-cr7f6" Nov 28 13:38:22 crc kubenswrapper[4970]: I1128 13:38:22.687155 4970 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="keystone-kuttl-tests/keystone-7db468b797-t6vnf" podStartSLOduration=2.687135402 podStartE2EDuration="2.687135402s" podCreationTimestamp="2025-11-28 13:38:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-28 13:38:22.683267652 +0000 UTC m=+1113.536149452" watchObservedRunningTime="2025-11-28 13:38:22.687135402 +0000 UTC m=+1113.540017212" Nov 28 13:38:50 crc kubenswrapper[4970]: I1128 13:38:50.058184 4970 scope.go:117] "RemoveContainer" containerID="d800e940f815b147b534d14c787ece37979cfbf8e8e38ad9dae376807f2b2f13" Nov 28 13:38:50 crc kubenswrapper[4970]: I1128 13:38:50.087652 4970 scope.go:117] "RemoveContainer" containerID="aaf6bc44cdcec6d739563488d1e130651e1ac2d983d5e350381122cb409815a5" Nov 28 13:38:50 crc kubenswrapper[4970]: I1128 13:38:50.123243 4970 scope.go:117] "RemoveContainer" containerID="0ba2f45b01a634cc40e7e9ce3535b5bcd447fe9de13a002d1f0023c5ace25b31" Nov 28 13:38:51 crc kubenswrapper[4970]: I1128 13:38:51.333336 4970 patch_prober.go:28] interesting pod/machine-config-daemon-tjrng container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 28 13:38:51 crc kubenswrapper[4970]: I1128 13:38:51.333699 4970 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-tjrng" podUID="70bedd43-c527-436e-b47b-0b9ec5b10601" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 28 13:38:51 crc kubenswrapper[4970]: I1128 13:38:51.333755 4970 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" 
pod="openshift-machine-config-operator/machine-config-daemon-tjrng" Nov 28 13:38:51 crc kubenswrapper[4970]: I1128 13:38:51.334470 4970 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"cd30743a39e211613c4a030816a122a40e3a3cc19bf445b31c5fe37b451ef30e"} pod="openshift-machine-config-operator/machine-config-daemon-tjrng" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Nov 28 13:38:51 crc kubenswrapper[4970]: I1128 13:38:51.334534 4970 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-tjrng" podUID="70bedd43-c527-436e-b47b-0b9ec5b10601" containerName="machine-config-daemon" containerID="cri-o://cd30743a39e211613c4a030816a122a40e3a3cc19bf445b31c5fe37b451ef30e" gracePeriod=600 Nov 28 13:38:51 crc kubenswrapper[4970]: I1128 13:38:51.893616 4970 generic.go:334] "Generic (PLEG): container finished" podID="70bedd43-c527-436e-b47b-0b9ec5b10601" containerID="cd30743a39e211613c4a030816a122a40e3a3cc19bf445b31c5fe37b451ef30e" exitCode=0 Nov 28 13:38:51 crc kubenswrapper[4970]: I1128 13:38:51.893671 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-tjrng" event={"ID":"70bedd43-c527-436e-b47b-0b9ec5b10601","Type":"ContainerDied","Data":"cd30743a39e211613c4a030816a122a40e3a3cc19bf445b31c5fe37b451ef30e"} Nov 28 13:38:51 crc kubenswrapper[4970]: I1128 13:38:51.893982 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-tjrng" event={"ID":"70bedd43-c527-436e-b47b-0b9ec5b10601","Type":"ContainerStarted","Data":"3b2419e8c0d194a5a29f4c079224199be6f97c77dca32e7413992a8fbfc0b4d2"} Nov 28 13:38:51 crc kubenswrapper[4970]: I1128 13:38:51.894027 4970 scope.go:117] "RemoveContainer" containerID="2ad74f7ddfaa8d711be3a8043f5b9573ad4e845f67f91479eefe9466a3a483c3" Nov 28 13:38:51 crc kubenswrapper[4970]: I1128 13:38:51.923054 4970 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="keystone-kuttl-tests/keystone-7db468b797-cr7f6" podStartSLOduration=31.923035078 podStartE2EDuration="31.923035078s" podCreationTimestamp="2025-11-28 13:38:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-28 13:38:22.709509976 +0000 UTC m=+1113.562391776" watchObservedRunningTime="2025-11-28 13:38:51.923035078 +0000 UTC m=+1142.775916878" Nov 28 13:38:52 crc kubenswrapper[4970]: I1128 13:38:52.821974 4970 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="keystone-kuttl-tests/keystone-7db468b797-cr7f6" Nov 28 13:38:52 crc kubenswrapper[4970]: I1128 13:38:52.887307 4970 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="keystone-kuttl-tests/keystone-7db468b797-t6vnf" Nov 28 13:38:53 crc kubenswrapper[4970]: I1128 13:38:53.904839 4970 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["keystone-kuttl-tests/keystone-7db468b797-t6vnf"] Nov 28 13:38:53 crc kubenswrapper[4970]: I1128 13:38:53.905550 4970 kuberuntime_container.go:808] "Killing container with a grace period" pod="keystone-kuttl-tests/keystone-7db468b797-t6vnf" podUID="171b3015-7d7c-4526-98fc-099779444c59" containerName="keystone-api" containerID="cri-o://d0e22ed8f97c683bd8e48789ef5e3939417ec7db10d0f741149ae3544e7fc1ff" gracePeriod=30 Nov 28 13:38:53 crc kubenswrapper[4970]: I1128 
13:38:53.910034 4970 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["keystone-kuttl-tests/keystone-7db468b797-cr7f6"] Nov 28 13:38:53 crc kubenswrapper[4970]: I1128 13:38:53.912008 4970 kuberuntime_container.go:808] "Killing container with a grace period" pod="keystone-kuttl-tests/keystone-7db468b797-cr7f6" podUID="827ef730-9bfe-4fc4-9a97-720691751c69" containerName="keystone-api" containerID="cri-o://8c34c9a279ed20316796976e583fc62a004af3b37e13613775beb3b29bb75c4f" gracePeriod=30 Nov 28 13:38:55 crc kubenswrapper[4970]: I1128 13:38:55.141169 4970 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["keystone-kuttl-tests/keystone-7db468b797-84k64"] Nov 28 13:38:55 crc kubenswrapper[4970]: I1128 13:38:55.141383 4970 kuberuntime_container.go:808] "Killing container with a grace period" pod="keystone-kuttl-tests/keystone-7db468b797-84k64" podUID="0f05ed6a-8d6d-40db-979a-f2d8dec89e36" containerName="keystone-api" containerID="cri-o://1551c31bbebde1b59002fd20730e7cc8255df9c3d077273806293f371f9e6129" gracePeriod=30 Nov 28 13:38:57 crc kubenswrapper[4970]: I1128 13:38:57.477598 4970 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone-7db468b797-t6vnf" Nov 28 13:38:57 crc kubenswrapper[4970]: I1128 13:38:57.483163 4970 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone-7db468b797-cr7f6" Nov 28 13:38:57 crc kubenswrapper[4970]: I1128 13:38:57.546766 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/171b3015-7d7c-4526-98fc-099779444c59-credential-keys\") pod \"171b3015-7d7c-4526-98fc-099779444c59\" (UID: \"171b3015-7d7c-4526-98fc-099779444c59\") " Nov 28 13:38:57 crc kubenswrapper[4970]: I1128 13:38:57.546881 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/171b3015-7d7c-4526-98fc-099779444c59-fernet-keys\") pod \"171b3015-7d7c-4526-98fc-099779444c59\" (UID: \"171b3015-7d7c-4526-98fc-099779444c59\") " Nov 28 13:38:57 crc kubenswrapper[4970]: I1128 13:38:57.546920 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/171b3015-7d7c-4526-98fc-099779444c59-scripts\") pod \"171b3015-7d7c-4526-98fc-099779444c59\" (UID: \"171b3015-7d7c-4526-98fc-099779444c59\") " Nov 28 13:38:57 crc kubenswrapper[4970]: I1128 13:38:57.546958 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6bxq5\" (UniqueName: \"kubernetes.io/projected/171b3015-7d7c-4526-98fc-099779444c59-kube-api-access-6bxq5\") pod \"171b3015-7d7c-4526-98fc-099779444c59\" (UID: \"171b3015-7d7c-4526-98fc-099779444c59\") " Nov 28 13:38:57 crc kubenswrapper[4970]: I1128 13:38:57.547000 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/171b3015-7d7c-4526-98fc-099779444c59-config-data\") pod \"171b3015-7d7c-4526-98fc-099779444c59\" (UID: \"171b3015-7d7c-4526-98fc-099779444c59\") " Nov 28 13:38:57 crc kubenswrapper[4970]: I1128 13:38:57.553666 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/171b3015-7d7c-4526-98fc-099779444c59-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "171b3015-7d7c-4526-98fc-099779444c59" (UID: "171b3015-7d7c-4526-98fc-099779444c59"). 
InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 28 13:38:57 crc kubenswrapper[4970]: I1128 13:38:57.554323 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/171b3015-7d7c-4526-98fc-099779444c59-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "171b3015-7d7c-4526-98fc-099779444c59" (UID: "171b3015-7d7c-4526-98fc-099779444c59"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 28 13:38:57 crc kubenswrapper[4970]: I1128 13:38:57.554341 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/171b3015-7d7c-4526-98fc-099779444c59-scripts" (OuterVolumeSpecName: "scripts") pod "171b3015-7d7c-4526-98fc-099779444c59" (UID: "171b3015-7d7c-4526-98fc-099779444c59"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 28 13:38:57 crc kubenswrapper[4970]: I1128 13:38:57.556944 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/171b3015-7d7c-4526-98fc-099779444c59-kube-api-access-6bxq5" (OuterVolumeSpecName: "kube-api-access-6bxq5") pod "171b3015-7d7c-4526-98fc-099779444c59" (UID: "171b3015-7d7c-4526-98fc-099779444c59"). InnerVolumeSpecName "kube-api-access-6bxq5". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 28 13:38:57 crc kubenswrapper[4970]: I1128 13:38:57.571364 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/171b3015-7d7c-4526-98fc-099779444c59-config-data" (OuterVolumeSpecName: "config-data") pod "171b3015-7d7c-4526-98fc-099779444c59" (UID: "171b3015-7d7c-4526-98fc-099779444c59"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 28 13:38:57 crc kubenswrapper[4970]: I1128 13:38:57.648789 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/827ef730-9bfe-4fc4-9a97-720691751c69-config-data\") pod \"827ef730-9bfe-4fc4-9a97-720691751c69\" (UID: \"827ef730-9bfe-4fc4-9a97-720691751c69\") " Nov 28 13:38:57 crc kubenswrapper[4970]: I1128 13:38:57.649023 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/827ef730-9bfe-4fc4-9a97-720691751c69-fernet-keys\") pod \"827ef730-9bfe-4fc4-9a97-720691751c69\" (UID: \"827ef730-9bfe-4fc4-9a97-720691751c69\") " Nov 28 13:38:57 crc kubenswrapper[4970]: I1128 13:38:57.649097 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/827ef730-9bfe-4fc4-9a97-720691751c69-scripts\") pod \"827ef730-9bfe-4fc4-9a97-720691751c69\" (UID: \"827ef730-9bfe-4fc4-9a97-720691751c69\") " Nov 28 13:38:57 crc kubenswrapper[4970]: I1128 13:38:57.649248 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7vgrc\" (UniqueName: \"kubernetes.io/projected/827ef730-9bfe-4fc4-9a97-720691751c69-kube-api-access-7vgrc\") pod \"827ef730-9bfe-4fc4-9a97-720691751c69\" (UID: \"827ef730-9bfe-4fc4-9a97-720691751c69\") " Nov 28 13:38:57 crc kubenswrapper[4970]: I1128 13:38:57.649340 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/827ef730-9bfe-4fc4-9a97-720691751c69-credential-keys\") pod \"827ef730-9bfe-4fc4-9a97-720691751c69\" (UID: 
\"827ef730-9bfe-4fc4-9a97-720691751c69\") " Nov 28 13:38:57 crc kubenswrapper[4970]: I1128 13:38:57.649641 4970 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6bxq5\" (UniqueName: \"kubernetes.io/projected/171b3015-7d7c-4526-98fc-099779444c59-kube-api-access-6bxq5\") on node \"crc\" DevicePath \"\"" Nov 28 13:38:57 crc kubenswrapper[4970]: I1128 13:38:57.649709 4970 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/171b3015-7d7c-4526-98fc-099779444c59-config-data\") on node \"crc\" DevicePath \"\"" Nov 28 13:38:57 crc kubenswrapper[4970]: I1128 13:38:57.649801 4970 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/171b3015-7d7c-4526-98fc-099779444c59-credential-keys\") on node \"crc\" DevicePath \"\"" Nov 28 13:38:57 crc kubenswrapper[4970]: I1128 13:38:57.649866 4970 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/171b3015-7d7c-4526-98fc-099779444c59-fernet-keys\") on node \"crc\" DevicePath \"\"" Nov 28 13:38:57 crc kubenswrapper[4970]: I1128 13:38:57.649931 4970 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/171b3015-7d7c-4526-98fc-099779444c59-scripts\") on node \"crc\" DevicePath \"\"" Nov 28 13:38:57 crc kubenswrapper[4970]: I1128 13:38:57.651695 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/827ef730-9bfe-4fc4-9a97-720691751c69-kube-api-access-7vgrc" (OuterVolumeSpecName: "kube-api-access-7vgrc") pod "827ef730-9bfe-4fc4-9a97-720691751c69" (UID: "827ef730-9bfe-4fc4-9a97-720691751c69"). InnerVolumeSpecName "kube-api-access-7vgrc". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 28 13:38:57 crc kubenswrapper[4970]: I1128 13:38:57.651891 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/827ef730-9bfe-4fc4-9a97-720691751c69-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "827ef730-9bfe-4fc4-9a97-720691751c69" (UID: "827ef730-9bfe-4fc4-9a97-720691751c69"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 28 13:38:57 crc kubenswrapper[4970]: I1128 13:38:57.652636 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/827ef730-9bfe-4fc4-9a97-720691751c69-scripts" (OuterVolumeSpecName: "scripts") pod "827ef730-9bfe-4fc4-9a97-720691751c69" (UID: "827ef730-9bfe-4fc4-9a97-720691751c69"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 28 13:38:57 crc kubenswrapper[4970]: I1128 13:38:57.652702 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/827ef730-9bfe-4fc4-9a97-720691751c69-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "827ef730-9bfe-4fc4-9a97-720691751c69" (UID: "827ef730-9bfe-4fc4-9a97-720691751c69"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 28 13:38:57 crc kubenswrapper[4970]: I1128 13:38:57.674462 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/827ef730-9bfe-4fc4-9a97-720691751c69-config-data" (OuterVolumeSpecName: "config-data") pod "827ef730-9bfe-4fc4-9a97-720691751c69" (UID: "827ef730-9bfe-4fc4-9a97-720691751c69"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 28 13:38:57 crc kubenswrapper[4970]: I1128 13:38:57.751128 4970 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/827ef730-9bfe-4fc4-9a97-720691751c69-scripts\") on node \"crc\" DevicePath \"\"" Nov 28 13:38:57 crc kubenswrapper[4970]: I1128 13:38:57.751172 4970 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7vgrc\" (UniqueName: \"kubernetes.io/projected/827ef730-9bfe-4fc4-9a97-720691751c69-kube-api-access-7vgrc\") on node \"crc\" DevicePath \"\"" Nov 28 13:38:57 crc kubenswrapper[4970]: I1128 13:38:57.751187 4970 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/827ef730-9bfe-4fc4-9a97-720691751c69-credential-keys\") on node \"crc\" DevicePath \"\"" Nov 28 13:38:57 crc kubenswrapper[4970]: I1128 13:38:57.751198 4970 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/827ef730-9bfe-4fc4-9a97-720691751c69-config-data\") on node \"crc\" DevicePath \"\"" Nov 28 13:38:57 crc kubenswrapper[4970]: I1128 13:38:57.751209 4970 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/827ef730-9bfe-4fc4-9a97-720691751c69-fernet-keys\") on node \"crc\" DevicePath \"\"" Nov 28 13:38:57 crc kubenswrapper[4970]: I1128 13:38:57.957968 4970 generic.go:334] "Generic (PLEG): container finished" podID="171b3015-7d7c-4526-98fc-099779444c59" containerID="d0e22ed8f97c683bd8e48789ef5e3939417ec7db10d0f741149ae3544e7fc1ff" exitCode=0 Nov 28 13:38:57 crc kubenswrapper[4970]: I1128 13:38:57.958011 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-7db468b797-t6vnf" event={"ID":"171b3015-7d7c-4526-98fc-099779444c59","Type":"ContainerDied","Data":"d0e22ed8f97c683bd8e48789ef5e3939417ec7db10d0f741149ae3544e7fc1ff"} Nov 28 13:38:57 crc kubenswrapper[4970]: I1128 13:38:57.958027 4970 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="keystone-kuttl-tests/keystone-7db468b797-t6vnf" Nov 28 13:38:57 crc kubenswrapper[4970]: I1128 13:38:57.958048 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-7db468b797-t6vnf" event={"ID":"171b3015-7d7c-4526-98fc-099779444c59","Type":"ContainerDied","Data":"529c6bdaf0392e57acabfa1901aba9d882f97b20c16b1cdf69c299949545951b"} Nov 28 13:38:57 crc kubenswrapper[4970]: I1128 13:38:57.958067 4970 scope.go:117] "RemoveContainer" containerID="d0e22ed8f97c683bd8e48789ef5e3939417ec7db10d0f741149ae3544e7fc1ff" Nov 28 13:38:57 crc kubenswrapper[4970]: I1128 13:38:57.959521 4970 generic.go:334] "Generic (PLEG): container finished" podID="827ef730-9bfe-4fc4-9a97-720691751c69" containerID="8c34c9a279ed20316796976e583fc62a004af3b37e13613775beb3b29bb75c4f" exitCode=0 Nov 28 13:38:57 crc kubenswrapper[4970]: I1128 13:38:57.959547 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-7db468b797-cr7f6" event={"ID":"827ef730-9bfe-4fc4-9a97-720691751c69","Type":"ContainerDied","Data":"8c34c9a279ed20316796976e583fc62a004af3b37e13613775beb3b29bb75c4f"} Nov 28 13:38:57 crc kubenswrapper[4970]: I1128 13:38:57.959561 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-7db468b797-cr7f6" event={"ID":"827ef730-9bfe-4fc4-9a97-720691751c69","Type":"ContainerDied","Data":"a335c90583bdcbc13da2a74d958a83940ef7db2f1da21274a2621e0f578755d6"} Nov 28 13:38:57 crc kubenswrapper[4970]: I1128 13:38:57.959566 4970 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone-7db468b797-cr7f6" Nov 28 13:38:57 crc kubenswrapper[4970]: I1128 13:38:57.975104 4970 scope.go:117] "RemoveContainer" containerID="d0e22ed8f97c683bd8e48789ef5e3939417ec7db10d0f741149ae3544e7fc1ff" Nov 28 13:38:57 crc kubenswrapper[4970]: E1128 13:38:57.976611 4970 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d0e22ed8f97c683bd8e48789ef5e3939417ec7db10d0f741149ae3544e7fc1ff\": container with ID starting with d0e22ed8f97c683bd8e48789ef5e3939417ec7db10d0f741149ae3544e7fc1ff not found: ID does not exist" containerID="d0e22ed8f97c683bd8e48789ef5e3939417ec7db10d0f741149ae3544e7fc1ff" Nov 28 13:38:57 crc kubenswrapper[4970]: I1128 13:38:57.976641 4970 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d0e22ed8f97c683bd8e48789ef5e3939417ec7db10d0f741149ae3544e7fc1ff"} err="failed to get container status \"d0e22ed8f97c683bd8e48789ef5e3939417ec7db10d0f741149ae3544e7fc1ff\": rpc error: code = NotFound desc = could not find container \"d0e22ed8f97c683bd8e48789ef5e3939417ec7db10d0f741149ae3544e7fc1ff\": container with ID starting with d0e22ed8f97c683bd8e48789ef5e3939417ec7db10d0f741149ae3544e7fc1ff not found: ID does not exist" Nov 28 13:38:57 crc kubenswrapper[4970]: I1128 13:38:57.976663 4970 scope.go:117] "RemoveContainer" containerID="8c34c9a279ed20316796976e583fc62a004af3b37e13613775beb3b29bb75c4f" Nov 28 13:38:57 crc kubenswrapper[4970]: I1128 13:38:57.986628 4970 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["keystone-kuttl-tests/keystone-7db468b797-t6vnf"] Nov 28 13:38:57 crc kubenswrapper[4970]: I1128 13:38:57.995613 4970 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["keystone-kuttl-tests/keystone-7db468b797-t6vnf"] Nov 28 13:38:58 crc kubenswrapper[4970]: I1128 13:38:58.005861 4970 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["keystone-kuttl-tests/keystone-7db468b797-cr7f6"] Nov 28 13:38:58 crc kubenswrapper[4970]: I1128 13:38:58.010298 4970 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["keystone-kuttl-tests/keystone-7db468b797-cr7f6"] Nov 28 13:38:58 crc kubenswrapper[4970]: I1128 13:38:58.010535 4970 scope.go:117] "RemoveContainer" containerID="8c34c9a279ed20316796976e583fc62a004af3b37e13613775beb3b29bb75c4f" Nov 28 13:38:58 crc kubenswrapper[4970]: E1128 13:38:58.010867 4970 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8c34c9a279ed20316796976e583fc62a004af3b37e13613775beb3b29bb75c4f\": container with ID starting with 8c34c9a279ed20316796976e583fc62a004af3b37e13613775beb3b29bb75c4f not found: ID does not exist" containerID="8c34c9a279ed20316796976e583fc62a004af3b37e13613775beb3b29bb75c4f" Nov 28 13:38:58 crc kubenswrapper[4970]: I1128 13:38:58.010891 4970 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8c34c9a279ed20316796976e583fc62a004af3b37e13613775beb3b29bb75c4f"} err="failed to get container status \"8c34c9a279ed20316796976e583fc62a004af3b37e13613775beb3b29bb75c4f\": rpc error: code = NotFound desc = could not find container \"8c34c9a279ed20316796976e583fc62a004af3b37e13613775beb3b29bb75c4f\": container with ID starting with 8c34c9a279ed20316796976e583fc62a004af3b37e13613775beb3b29bb75c4f not found: ID does not exist" Nov 28 13:38:58 crc kubenswrapper[4970]: I1128 13:38:58.717400 4970 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone-7db468b797-84k64" Nov 28 13:38:58 crc kubenswrapper[4970]: I1128 13:38:58.862821 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0f05ed6a-8d6d-40db-979a-f2d8dec89e36-config-data\") pod \"0f05ed6a-8d6d-40db-979a-f2d8dec89e36\" (UID: \"0f05ed6a-8d6d-40db-979a-f2d8dec89e36\") " Nov 28 13:38:58 crc kubenswrapper[4970]: I1128 13:38:58.862931 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/0f05ed6a-8d6d-40db-979a-f2d8dec89e36-credential-keys\") pod \"0f05ed6a-8d6d-40db-979a-f2d8dec89e36\" (UID: \"0f05ed6a-8d6d-40db-979a-f2d8dec89e36\") " Nov 28 13:38:58 crc kubenswrapper[4970]: I1128 13:38:58.862960 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/0f05ed6a-8d6d-40db-979a-f2d8dec89e36-fernet-keys\") pod \"0f05ed6a-8d6d-40db-979a-f2d8dec89e36\" (UID: \"0f05ed6a-8d6d-40db-979a-f2d8dec89e36\") " Nov 28 13:38:58 crc kubenswrapper[4970]: I1128 13:38:58.863026 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pmtlp\" (UniqueName: \"kubernetes.io/projected/0f05ed6a-8d6d-40db-979a-f2d8dec89e36-kube-api-access-pmtlp\") pod \"0f05ed6a-8d6d-40db-979a-f2d8dec89e36\" (UID: \"0f05ed6a-8d6d-40db-979a-f2d8dec89e36\") " Nov 28 13:38:58 crc kubenswrapper[4970]: I1128 13:38:58.863090 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0f05ed6a-8d6d-40db-979a-f2d8dec89e36-scripts\") pod \"0f05ed6a-8d6d-40db-979a-f2d8dec89e36\" (UID: \"0f05ed6a-8d6d-40db-979a-f2d8dec89e36\") " Nov 28 13:38:58 crc kubenswrapper[4970]: I1128 13:38:58.867247 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/0f05ed6a-8d6d-40db-979a-f2d8dec89e36-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "0f05ed6a-8d6d-40db-979a-f2d8dec89e36" (UID: "0f05ed6a-8d6d-40db-979a-f2d8dec89e36"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 28 13:38:58 crc kubenswrapper[4970]: I1128 13:38:58.867340 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0f05ed6a-8d6d-40db-979a-f2d8dec89e36-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "0f05ed6a-8d6d-40db-979a-f2d8dec89e36" (UID: "0f05ed6a-8d6d-40db-979a-f2d8dec89e36"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 28 13:38:58 crc kubenswrapper[4970]: I1128 13:38:58.870815 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0f05ed6a-8d6d-40db-979a-f2d8dec89e36-scripts" (OuterVolumeSpecName: "scripts") pod "0f05ed6a-8d6d-40db-979a-f2d8dec89e36" (UID: "0f05ed6a-8d6d-40db-979a-f2d8dec89e36"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 28 13:38:58 crc kubenswrapper[4970]: I1128 13:38:58.871107 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0f05ed6a-8d6d-40db-979a-f2d8dec89e36-kube-api-access-pmtlp" (OuterVolumeSpecName: "kube-api-access-pmtlp") pod "0f05ed6a-8d6d-40db-979a-f2d8dec89e36" (UID: "0f05ed6a-8d6d-40db-979a-f2d8dec89e36"). InnerVolumeSpecName "kube-api-access-pmtlp". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 28 13:38:58 crc kubenswrapper[4970]: I1128 13:38:58.884615 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0f05ed6a-8d6d-40db-979a-f2d8dec89e36-config-data" (OuterVolumeSpecName: "config-data") pod "0f05ed6a-8d6d-40db-979a-f2d8dec89e36" (UID: "0f05ed6a-8d6d-40db-979a-f2d8dec89e36"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 28 13:38:58 crc kubenswrapper[4970]: I1128 13:38:58.964717 4970 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0f05ed6a-8d6d-40db-979a-f2d8dec89e36-config-data\") on node \"crc\" DevicePath \"\"" Nov 28 13:38:58 crc kubenswrapper[4970]: I1128 13:38:58.964765 4970 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/0f05ed6a-8d6d-40db-979a-f2d8dec89e36-credential-keys\") on node \"crc\" DevicePath \"\"" Nov 28 13:38:58 crc kubenswrapper[4970]: I1128 13:38:58.964787 4970 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/0f05ed6a-8d6d-40db-979a-f2d8dec89e36-fernet-keys\") on node \"crc\" DevicePath \"\"" Nov 28 13:38:58 crc kubenswrapper[4970]: I1128 13:38:58.964809 4970 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pmtlp\" (UniqueName: \"kubernetes.io/projected/0f05ed6a-8d6d-40db-979a-f2d8dec89e36-kube-api-access-pmtlp\") on node \"crc\" DevicePath \"\"" Nov 28 13:38:58 crc kubenswrapper[4970]: I1128 13:38:58.964828 4970 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0f05ed6a-8d6d-40db-979a-f2d8dec89e36-scripts\") on node \"crc\" DevicePath \"\"" Nov 28 13:38:58 crc kubenswrapper[4970]: I1128 13:38:58.970209 4970 generic.go:334] "Generic (PLEG): container finished" podID="0f05ed6a-8d6d-40db-979a-f2d8dec89e36" containerID="1551c31bbebde1b59002fd20730e7cc8255df9c3d077273806293f371f9e6129" exitCode=0 Nov 28 13:38:58 crc kubenswrapper[4970]: I1128 13:38:58.970276 4970 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone-7db468b797-84k64" Nov 28 13:38:58 crc kubenswrapper[4970]: I1128 13:38:58.970305 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-7db468b797-84k64" event={"ID":"0f05ed6a-8d6d-40db-979a-f2d8dec89e36","Type":"ContainerDied","Data":"1551c31bbebde1b59002fd20730e7cc8255df9c3d077273806293f371f9e6129"} Nov 28 13:38:58 crc kubenswrapper[4970]: I1128 13:38:58.970335 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-7db468b797-84k64" event={"ID":"0f05ed6a-8d6d-40db-979a-f2d8dec89e36","Type":"ContainerDied","Data":"c30369795ed6b22a46437cc76aa448e00a92e88d297c8cef95ee86e7029ee27d"} Nov 28 13:38:58 crc kubenswrapper[4970]: I1128 13:38:58.970356 4970 scope.go:117] "RemoveContainer" containerID="1551c31bbebde1b59002fd20730e7cc8255df9c3d077273806293f371f9e6129" Nov 28 13:38:59 crc kubenswrapper[4970]: I1128 13:38:59.010102 4970 scope.go:117] "RemoveContainer" containerID="1551c31bbebde1b59002fd20730e7cc8255df9c3d077273806293f371f9e6129" Nov 28 13:38:59 crc kubenswrapper[4970]: I1128 13:38:59.010272 4970 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["keystone-kuttl-tests/keystone-7db468b797-84k64"] Nov 28 13:38:59 crc kubenswrapper[4970]: E1128 13:38:59.010737 4970 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1551c31bbebde1b59002fd20730e7cc8255df9c3d077273806293f371f9e6129\": container with ID starting with 1551c31bbebde1b59002fd20730e7cc8255df9c3d077273806293f371f9e6129 not found: ID does not exist" containerID="1551c31bbebde1b59002fd20730e7cc8255df9c3d077273806293f371f9e6129" Nov 28 13:38:59 crc kubenswrapper[4970]: I1128 13:38:59.010787 4970 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1551c31bbebde1b59002fd20730e7cc8255df9c3d077273806293f371f9e6129"} err="failed to get container status \"1551c31bbebde1b59002fd20730e7cc8255df9c3d077273806293f371f9e6129\": rpc error: code = NotFound desc = could not find container \"1551c31bbebde1b59002fd20730e7cc8255df9c3d077273806293f371f9e6129\": container with ID starting with 1551c31bbebde1b59002fd20730e7cc8255df9c3d077273806293f371f9e6129 not found: ID does not exist" Nov 28 13:38:59 crc kubenswrapper[4970]: I1128 13:38:59.022456 4970 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["keystone-kuttl-tests/keystone-7db468b797-84k64"] Nov 28 13:38:59 crc kubenswrapper[4970]: I1128 13:38:59.314549 4970 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["keystone-kuttl-tests/keystone-db-sync-d7h7j"] Nov 28 13:38:59 crc kubenswrapper[4970]: I1128 13:38:59.327536 4970 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["keystone-kuttl-tests/keystone-db-sync-d7h7j"] Nov 28 13:38:59 crc kubenswrapper[4970]: I1128 13:38:59.339270 4970 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["keystone-kuttl-tests/keystone-bootstrap-z6qgl"] Nov 28 13:38:59 crc kubenswrapper[4970]: I1128 13:38:59.345291 4970 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["keystone-kuttl-tests/keystone-bootstrap-z6qgl"] Nov 28 13:38:59 crc kubenswrapper[4970]: I1128 13:38:59.369714 4970 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["keystone-kuttl-tests/keystonec65e-account-delete-flx6r"] Nov 28 13:38:59 crc kubenswrapper[4970]: E1128 13:38:59.369944 4970 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0f05ed6a-8d6d-40db-979a-f2d8dec89e36" containerName="keystone-api" Nov 28 13:38:59 crc kubenswrapper[4970]: I1128 13:38:59.369960 4970 state_mem.go:107] "Deleted CPUSet assignment" podUID="0f05ed6a-8d6d-40db-979a-f2d8dec89e36" containerName="keystone-api" Nov 28 13:38:59 crc kubenswrapper[4970]: E1128 13:38:59.369973 4970 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="171b3015-7d7c-4526-98fc-099779444c59" containerName="keystone-api" Nov 28 13:38:59 crc kubenswrapper[4970]: I1128 13:38:59.369979 4970 state_mem.go:107] "Deleted CPUSet assignment" podUID="171b3015-7d7c-4526-98fc-099779444c59" containerName="keystone-api" Nov 28 13:38:59 crc kubenswrapper[4970]: E1128 13:38:59.369991 4970 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="827ef730-9bfe-4fc4-9a97-720691751c69" containerName="keystone-api" Nov 28 13:38:59 crc kubenswrapper[4970]: I1128 13:38:59.369998 4970 state_mem.go:107] "Deleted CPUSet assignment" podUID="827ef730-9bfe-4fc4-9a97-720691751c69" containerName="keystone-api" Nov 28 13:38:59 crc kubenswrapper[4970]: I1128 13:38:59.370102 4970 memory_manager.go:354] "RemoveStaleState removing state" podUID="827ef730-9bfe-4fc4-9a97-720691751c69" containerName="keystone-api" Nov 28 13:38:59 crc kubenswrapper[4970]: I1128 13:38:59.370118 4970 memory_manager.go:354] "RemoveStaleState removing state" podUID="171b3015-7d7c-4526-98fc-099779444c59" containerName="keystone-api" Nov 28 13:38:59 crc kubenswrapper[4970]: I1128 13:38:59.370126 4970 memory_manager.go:354] "RemoveStaleState removing state" podUID="0f05ed6a-8d6d-40db-979a-f2d8dec89e36" containerName="keystone-api" Nov 28 13:38:59 crc kubenswrapper[4970]: I1128 13:38:59.370520 4970 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="keystone-kuttl-tests/keystonec65e-account-delete-flx6r" Nov 28 13:38:59 crc kubenswrapper[4970]: I1128 13:38:59.390427 4970 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0f05ed6a-8d6d-40db-979a-f2d8dec89e36" path="/var/lib/kubelet/pods/0f05ed6a-8d6d-40db-979a-f2d8dec89e36/volumes" Nov 28 13:38:59 crc kubenswrapper[4970]: I1128 13:38:59.391374 4970 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="171b3015-7d7c-4526-98fc-099779444c59" path="/var/lib/kubelet/pods/171b3015-7d7c-4526-98fc-099779444c59/volumes" Nov 28 13:38:59 crc kubenswrapper[4970]: I1128 13:38:59.391950 4970 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="827ef730-9bfe-4fc4-9a97-720691751c69" path="/var/lib/kubelet/pods/827ef730-9bfe-4fc4-9a97-720691751c69/volumes" Nov 28 13:38:59 crc kubenswrapper[4970]: I1128 13:38:59.392588 4970 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8dce7ed1-5bf7-4d47-b9d5-00a9b4cfb822" path="/var/lib/kubelet/pods/8dce7ed1-5bf7-4d47-b9d5-00a9b4cfb822/volumes" Nov 28 13:38:59 crc kubenswrapper[4970]: I1128 13:38:59.393928 4970 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fbd150ac-9406-4874-beb0-1de8b4501c1b" path="/var/lib/kubelet/pods/fbd150ac-9406-4874-beb0-1de8b4501c1b/volumes" Nov 28 13:38:59 crc kubenswrapper[4970]: I1128 13:38:59.394476 4970 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/keystonec65e-account-delete-flx6r"] Nov 28 13:38:59 crc kubenswrapper[4970]: I1128 13:38:59.473362 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e0ed9407-5afb-40eb-b526-8203d172b030-operator-scripts\") pod \"keystonec65e-account-delete-flx6r\" (UID: \"e0ed9407-5afb-40eb-b526-8203d172b030\") " pod="keystone-kuttl-tests/keystonec65e-account-delete-flx6r" Nov 28 13:38:59 crc kubenswrapper[4970]: I1128 13:38:59.473519 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4mdb2\" (UniqueName: \"kubernetes.io/projected/e0ed9407-5afb-40eb-b526-8203d172b030-kube-api-access-4mdb2\") pod \"keystonec65e-account-delete-flx6r\" (UID: \"e0ed9407-5afb-40eb-b526-8203d172b030\") " pod="keystone-kuttl-tests/keystonec65e-account-delete-flx6r" Nov 28 13:38:59 crc kubenswrapper[4970]: I1128 13:38:59.575320 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4mdb2\" (UniqueName: \"kubernetes.io/projected/e0ed9407-5afb-40eb-b526-8203d172b030-kube-api-access-4mdb2\") pod \"keystonec65e-account-delete-flx6r\" (UID: \"e0ed9407-5afb-40eb-b526-8203d172b030\") " pod="keystone-kuttl-tests/keystonec65e-account-delete-flx6r" Nov 28 13:38:59 crc kubenswrapper[4970]: I1128 13:38:59.575481 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e0ed9407-5afb-40eb-b526-8203d172b030-operator-scripts\") pod \"keystonec65e-account-delete-flx6r\" (UID: \"e0ed9407-5afb-40eb-b526-8203d172b030\") " pod="keystone-kuttl-tests/keystonec65e-account-delete-flx6r" Nov 28 13:38:59 crc kubenswrapper[4970]: I1128 13:38:59.576805 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e0ed9407-5afb-40eb-b526-8203d172b030-operator-scripts\") pod \"keystonec65e-account-delete-flx6r\" (UID: 
\"e0ed9407-5afb-40eb-b526-8203d172b030\") " pod="keystone-kuttl-tests/keystonec65e-account-delete-flx6r" Nov 28 13:38:59 crc kubenswrapper[4970]: I1128 13:38:59.599339 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4mdb2\" (UniqueName: \"kubernetes.io/projected/e0ed9407-5afb-40eb-b526-8203d172b030-kube-api-access-4mdb2\") pod \"keystonec65e-account-delete-flx6r\" (UID: \"e0ed9407-5afb-40eb-b526-8203d172b030\") " pod="keystone-kuttl-tests/keystonec65e-account-delete-flx6r" Nov 28 13:38:59 crc kubenswrapper[4970]: I1128 13:38:59.685829 4970 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystonec65e-account-delete-flx6r" Nov 28 13:39:00 crc kubenswrapper[4970]: I1128 13:39:00.243135 4970 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/keystonec65e-account-delete-flx6r"] Nov 28 13:39:00 crc kubenswrapper[4970]: W1128 13:39:00.249615 4970 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode0ed9407_5afb_40eb_b526_8203d172b030.slice/crio-8b380530ded82fca6cefbcbecd5093b78ba76c94ce3b3194e810e73eb05bfa1d WatchSource:0}: Error finding container 8b380530ded82fca6cefbcbecd5093b78ba76c94ce3b3194e810e73eb05bfa1d: Status 404 returned error can't find the container with id 8b380530ded82fca6cefbcbecd5093b78ba76c94ce3b3194e810e73eb05bfa1d Nov 28 13:39:00 crc kubenswrapper[4970]: I1128 13:39:00.998944 4970 generic.go:334] "Generic (PLEG): container finished" podID="e0ed9407-5afb-40eb-b526-8203d172b030" containerID="fe75df7012727427cb85fb6507dde88d14cc7f28c697916077fd01fe888ccea5" exitCode=0 Nov 28 13:39:00 crc kubenswrapper[4970]: I1128 13:39:00.999011 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystonec65e-account-delete-flx6r" event={"ID":"e0ed9407-5afb-40eb-b526-8203d172b030","Type":"ContainerDied","Data":"fe75df7012727427cb85fb6507dde88d14cc7f28c697916077fd01fe888ccea5"} Nov 28 13:39:00 crc kubenswrapper[4970]: I1128 13:39:00.999281 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystonec65e-account-delete-flx6r" event={"ID":"e0ed9407-5afb-40eb-b526-8203d172b030","Type":"ContainerStarted","Data":"8b380530ded82fca6cefbcbecd5093b78ba76c94ce3b3194e810e73eb05bfa1d"} Nov 28 13:39:02 crc kubenswrapper[4970]: I1128 13:39:02.366587 4970 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="keystone-kuttl-tests/keystonec65e-account-delete-flx6r" Nov 28 13:39:02 crc kubenswrapper[4970]: I1128 13:39:02.521660 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4mdb2\" (UniqueName: \"kubernetes.io/projected/e0ed9407-5afb-40eb-b526-8203d172b030-kube-api-access-4mdb2\") pod \"e0ed9407-5afb-40eb-b526-8203d172b030\" (UID: \"e0ed9407-5afb-40eb-b526-8203d172b030\") " Nov 28 13:39:02 crc kubenswrapper[4970]: I1128 13:39:02.521849 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e0ed9407-5afb-40eb-b526-8203d172b030-operator-scripts\") pod \"e0ed9407-5afb-40eb-b526-8203d172b030\" (UID: \"e0ed9407-5afb-40eb-b526-8203d172b030\") " Nov 28 13:39:02 crc kubenswrapper[4970]: I1128 13:39:02.523262 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e0ed9407-5afb-40eb-b526-8203d172b030-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "e0ed9407-5afb-40eb-b526-8203d172b030" (UID: "e0ed9407-5afb-40eb-b526-8203d172b030"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 28 13:39:02 crc kubenswrapper[4970]: I1128 13:39:02.534485 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e0ed9407-5afb-40eb-b526-8203d172b030-kube-api-access-4mdb2" (OuterVolumeSpecName: "kube-api-access-4mdb2") pod "e0ed9407-5afb-40eb-b526-8203d172b030" (UID: "e0ed9407-5afb-40eb-b526-8203d172b030"). InnerVolumeSpecName "kube-api-access-4mdb2". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 28 13:39:02 crc kubenswrapper[4970]: I1128 13:39:02.622920 4970 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4mdb2\" (UniqueName: \"kubernetes.io/projected/e0ed9407-5afb-40eb-b526-8203d172b030-kube-api-access-4mdb2\") on node \"crc\" DevicePath \"\"" Nov 28 13:39:02 crc kubenswrapper[4970]: I1128 13:39:02.622955 4970 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e0ed9407-5afb-40eb-b526-8203d172b030-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 28 13:39:03 crc kubenswrapper[4970]: I1128 13:39:03.016446 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystonec65e-account-delete-flx6r" event={"ID":"e0ed9407-5afb-40eb-b526-8203d172b030","Type":"ContainerDied","Data":"8b380530ded82fca6cefbcbecd5093b78ba76c94ce3b3194e810e73eb05bfa1d"} Nov 28 13:39:03 crc kubenswrapper[4970]: I1128 13:39:03.016523 4970 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8b380530ded82fca6cefbcbecd5093b78ba76c94ce3b3194e810e73eb05bfa1d" Nov 28 13:39:03 crc kubenswrapper[4970]: I1128 13:39:03.016490 4970 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="keystone-kuttl-tests/keystonec65e-account-delete-flx6r" Nov 28 13:39:04 crc kubenswrapper[4970]: I1128 13:39:04.400332 4970 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["keystone-kuttl-tests/keystone-db-create-6hvg9"] Nov 28 13:39:04 crc kubenswrapper[4970]: I1128 13:39:04.404646 4970 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["keystone-kuttl-tests/keystone-db-create-6hvg9"] Nov 28 13:39:04 crc kubenswrapper[4970]: I1128 13:39:04.409505 4970 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["keystone-kuttl-tests/keystone-c65e-account-create-update-28mpl"] Nov 28 13:39:04 crc kubenswrapper[4970]: I1128 13:39:04.413393 4970 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["keystone-kuttl-tests/keystonec65e-account-delete-flx6r"] Nov 28 13:39:04 crc kubenswrapper[4970]: I1128 13:39:04.417183 4970 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["keystone-kuttl-tests/keystone-c65e-account-create-update-28mpl"] Nov 28 13:39:04 crc kubenswrapper[4970]: I1128 13:39:04.420909 4970 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["keystone-kuttl-tests/keystonec65e-account-delete-flx6r"] Nov 28 13:39:04 crc kubenswrapper[4970]: I1128 13:39:04.483744 4970 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["keystone-kuttl-tests/keystone-db-create-vc4f6"] Nov 28 13:39:04 crc kubenswrapper[4970]: E1128 13:39:04.483991 4970 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e0ed9407-5afb-40eb-b526-8203d172b030" containerName="mariadb-account-delete" Nov 28 13:39:04 crc kubenswrapper[4970]: I1128 13:39:04.484007 4970 state_mem.go:107] "Deleted CPUSet assignment" podUID="e0ed9407-5afb-40eb-b526-8203d172b030" containerName="mariadb-account-delete" Nov 28 13:39:04 crc kubenswrapper[4970]: I1128 13:39:04.484126 4970 memory_manager.go:354] "RemoveStaleState removing state" podUID="e0ed9407-5afb-40eb-b526-8203d172b030" containerName="mariadb-account-delete" Nov 28 13:39:04 crc kubenswrapper[4970]: I1128 13:39:04.484668 4970 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone-db-create-vc4f6" Nov 28 13:39:04 crc kubenswrapper[4970]: I1128 13:39:04.510786 4970 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/keystone-db-create-vc4f6"] Nov 28 13:39:04 crc kubenswrapper[4970]: I1128 13:39:04.564335 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mzpqn\" (UniqueName: \"kubernetes.io/projected/8d33b453-5415-4302-a3a6-77a74219cf38-kube-api-access-mzpqn\") pod \"keystone-db-create-vc4f6\" (UID: \"8d33b453-5415-4302-a3a6-77a74219cf38\") " pod="keystone-kuttl-tests/keystone-db-create-vc4f6" Nov 28 13:39:04 crc kubenswrapper[4970]: I1128 13:39:04.564409 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8d33b453-5415-4302-a3a6-77a74219cf38-operator-scripts\") pod \"keystone-db-create-vc4f6\" (UID: \"8d33b453-5415-4302-a3a6-77a74219cf38\") " pod="keystone-kuttl-tests/keystone-db-create-vc4f6" Nov 28 13:39:04 crc kubenswrapper[4970]: I1128 13:39:04.589596 4970 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["keystone-kuttl-tests/keystone-66ba-account-create-update-q4plw"] Nov 28 13:39:04 crc kubenswrapper[4970]: I1128 13:39:04.590367 4970 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="keystone-kuttl-tests/keystone-66ba-account-create-update-q4plw" Nov 28 13:39:04 crc kubenswrapper[4970]: I1128 13:39:04.593410 4970 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"keystone-db-secret" Nov 28 13:39:04 crc kubenswrapper[4970]: I1128 13:39:04.597178 4970 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/keystone-66ba-account-create-update-q4plw"] Nov 28 13:39:04 crc kubenswrapper[4970]: I1128 13:39:04.665180 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mzpqn\" (UniqueName: \"kubernetes.io/projected/8d33b453-5415-4302-a3a6-77a74219cf38-kube-api-access-mzpqn\") pod \"keystone-db-create-vc4f6\" (UID: \"8d33b453-5415-4302-a3a6-77a74219cf38\") " pod="keystone-kuttl-tests/keystone-db-create-vc4f6" Nov 28 13:39:04 crc kubenswrapper[4970]: I1128 13:39:04.665255 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8d33b453-5415-4302-a3a6-77a74219cf38-operator-scripts\") pod \"keystone-db-create-vc4f6\" (UID: \"8d33b453-5415-4302-a3a6-77a74219cf38\") " pod="keystone-kuttl-tests/keystone-db-create-vc4f6" Nov 28 13:39:04 crc kubenswrapper[4970]: I1128 13:39:04.665958 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8d33b453-5415-4302-a3a6-77a74219cf38-operator-scripts\") pod \"keystone-db-create-vc4f6\" (UID: \"8d33b453-5415-4302-a3a6-77a74219cf38\") " pod="keystone-kuttl-tests/keystone-db-create-vc4f6" Nov 28 13:39:04 crc kubenswrapper[4970]: I1128 13:39:04.679729 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mzpqn\" (UniqueName: \"kubernetes.io/projected/8d33b453-5415-4302-a3a6-77a74219cf38-kube-api-access-mzpqn\") pod \"keystone-db-create-vc4f6\" (UID: \"8d33b453-5415-4302-a3a6-77a74219cf38\") " pod="keystone-kuttl-tests/keystone-db-create-vc4f6" Nov 28 13:39:04 crc kubenswrapper[4970]: I1128 13:39:04.766690 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5f3d4672-d9ce-46b4-829a-b595fc3909ca-operator-scripts\") pod \"keystone-66ba-account-create-update-q4plw\" (UID: \"5f3d4672-d9ce-46b4-829a-b595fc3909ca\") " pod="keystone-kuttl-tests/keystone-66ba-account-create-update-q4plw" Nov 28 13:39:04 crc kubenswrapper[4970]: I1128 13:39:04.766862 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4p6m7\" (UniqueName: \"kubernetes.io/projected/5f3d4672-d9ce-46b4-829a-b595fc3909ca-kube-api-access-4p6m7\") pod \"keystone-66ba-account-create-update-q4plw\" (UID: \"5f3d4672-d9ce-46b4-829a-b595fc3909ca\") " pod="keystone-kuttl-tests/keystone-66ba-account-create-update-q4plw" Nov 28 13:39:04 crc kubenswrapper[4970]: I1128 13:39:04.797454 4970 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="keystone-kuttl-tests/keystone-db-create-vc4f6" Nov 28 13:39:04 crc kubenswrapper[4970]: I1128 13:39:04.868427 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5f3d4672-d9ce-46b4-829a-b595fc3909ca-operator-scripts\") pod \"keystone-66ba-account-create-update-q4plw\" (UID: \"5f3d4672-d9ce-46b4-829a-b595fc3909ca\") " pod="keystone-kuttl-tests/keystone-66ba-account-create-update-q4plw" Nov 28 13:39:04 crc kubenswrapper[4970]: I1128 13:39:04.868591 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4p6m7\" (UniqueName: \"kubernetes.io/projected/5f3d4672-d9ce-46b4-829a-b595fc3909ca-kube-api-access-4p6m7\") pod \"keystone-66ba-account-create-update-q4plw\" (UID: \"5f3d4672-d9ce-46b4-829a-b595fc3909ca\") " pod="keystone-kuttl-tests/keystone-66ba-account-create-update-q4plw" Nov 28 13:39:04 crc kubenswrapper[4970]: I1128 13:39:04.869768 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5f3d4672-d9ce-46b4-829a-b595fc3909ca-operator-scripts\") pod \"keystone-66ba-account-create-update-q4plw\" (UID: \"5f3d4672-d9ce-46b4-829a-b595fc3909ca\") " pod="keystone-kuttl-tests/keystone-66ba-account-create-update-q4plw" Nov 28 13:39:04 crc kubenswrapper[4970]: I1128 13:39:04.887148 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4p6m7\" (UniqueName: \"kubernetes.io/projected/5f3d4672-d9ce-46b4-829a-b595fc3909ca-kube-api-access-4p6m7\") pod \"keystone-66ba-account-create-update-q4plw\" (UID: \"5f3d4672-d9ce-46b4-829a-b595fc3909ca\") " pod="keystone-kuttl-tests/keystone-66ba-account-create-update-q4plw" Nov 28 13:39:04 crc kubenswrapper[4970]: I1128 13:39:04.909851 4970 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="keystone-kuttl-tests/keystone-66ba-account-create-update-q4plw" Nov 28 13:39:05 crc kubenswrapper[4970]: I1128 13:39:05.022537 4970 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/keystone-db-create-vc4f6"] Nov 28 13:39:05 crc kubenswrapper[4970]: W1128 13:39:05.029734 4970 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8d33b453_5415_4302_a3a6_77a74219cf38.slice/crio-16ae85821a0641a53d3071e8639106486c8ea1423594a29296f1c9612068271d WatchSource:0}: Error finding container 16ae85821a0641a53d3071e8639106486c8ea1423594a29296f1c9612068271d: Status 404 returned error can't find the container with id 16ae85821a0641a53d3071e8639106486c8ea1423594a29296f1c9612068271d Nov 28 13:39:05 crc kubenswrapper[4970]: I1128 13:39:05.312822 4970 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/keystone-66ba-account-create-update-q4plw"] Nov 28 13:39:05 crc kubenswrapper[4970]: W1128 13:39:05.341082 4970 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5f3d4672_d9ce_46b4_829a_b595fc3909ca.slice/crio-2e71b29ae65df83600fe6f88cb1a9abda9e88e8b793dbb92d8da9c51320d5314 WatchSource:0}: Error finding container 2e71b29ae65df83600fe6f88cb1a9abda9e88e8b793dbb92d8da9c51320d5314: Status 404 returned error can't find the container with id 2e71b29ae65df83600fe6f88cb1a9abda9e88e8b793dbb92d8da9c51320d5314 Nov 28 13:39:05 crc kubenswrapper[4970]: I1128 13:39:05.392936 4970 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6aec0740-f76c-415e-927f-484e717f3aa1" path="/var/lib/kubelet/pods/6aec0740-f76c-415e-927f-484e717f3aa1/volumes" Nov 28 13:39:05 crc kubenswrapper[4970]: I1128 13:39:05.393616 4970 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e0ed9407-5afb-40eb-b526-8203d172b030" path="/var/lib/kubelet/pods/e0ed9407-5afb-40eb-b526-8203d172b030/volumes" Nov 28 13:39:05 crc kubenswrapper[4970]: I1128 13:39:05.394092 4970 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fd27cede-0acf-40e3-b456-12c99ba203ae" path="/var/lib/kubelet/pods/fd27cede-0acf-40e3-b456-12c99ba203ae/volumes" Nov 28 13:39:06 crc kubenswrapper[4970]: I1128 13:39:06.046406 4970 generic.go:334] "Generic (PLEG): container finished" podID="8d33b453-5415-4302-a3a6-77a74219cf38" containerID="93944f73dc0636948b106db60dba8d56aa4e910ad8e953783c5ea33264ecf117" exitCode=0 Nov 28 13:39:06 crc kubenswrapper[4970]: I1128 13:39:06.046488 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-db-create-vc4f6" event={"ID":"8d33b453-5415-4302-a3a6-77a74219cf38","Type":"ContainerDied","Data":"93944f73dc0636948b106db60dba8d56aa4e910ad8e953783c5ea33264ecf117"} Nov 28 13:39:06 crc kubenswrapper[4970]: I1128 13:39:06.046552 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-db-create-vc4f6" event={"ID":"8d33b453-5415-4302-a3a6-77a74219cf38","Type":"ContainerStarted","Data":"16ae85821a0641a53d3071e8639106486c8ea1423594a29296f1c9612068271d"} Nov 28 13:39:06 crc kubenswrapper[4970]: I1128 13:39:06.049704 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-66ba-account-create-update-q4plw" event={"ID":"5f3d4672-d9ce-46b4-829a-b595fc3909ca","Type":"ContainerStarted","Data":"8d5290d1b1410eb750de6e70caeb91fa0c9236d2a5f735a059577ae1ed60c3ef"} Nov 28 13:39:06 crc 
kubenswrapper[4970]: I1128 13:39:06.049761 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-66ba-account-create-update-q4plw" event={"ID":"5f3d4672-d9ce-46b4-829a-b595fc3909ca","Type":"ContainerStarted","Data":"2e71b29ae65df83600fe6f88cb1a9abda9e88e8b793dbb92d8da9c51320d5314"} Nov 28 13:39:06 crc kubenswrapper[4970]: I1128 13:39:06.091082 4970 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="keystone-kuttl-tests/keystone-66ba-account-create-update-q4plw" podStartSLOduration=2.091058022 podStartE2EDuration="2.091058022s" podCreationTimestamp="2025-11-28 13:39:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-28 13:39:06.087732318 +0000 UTC m=+1156.940614208" watchObservedRunningTime="2025-11-28 13:39:06.091058022 +0000 UTC m=+1156.943939862" Nov 28 13:39:07 crc kubenswrapper[4970]: I1128 13:39:07.343637 4970 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone-db-create-vc4f6" Nov 28 13:39:07 crc kubenswrapper[4970]: I1128 13:39:07.517021 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8d33b453-5415-4302-a3a6-77a74219cf38-operator-scripts\") pod \"8d33b453-5415-4302-a3a6-77a74219cf38\" (UID: \"8d33b453-5415-4302-a3a6-77a74219cf38\") " Nov 28 13:39:07 crc kubenswrapper[4970]: I1128 13:39:07.517151 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mzpqn\" (UniqueName: \"kubernetes.io/projected/8d33b453-5415-4302-a3a6-77a74219cf38-kube-api-access-mzpqn\") pod \"8d33b453-5415-4302-a3a6-77a74219cf38\" (UID: \"8d33b453-5415-4302-a3a6-77a74219cf38\") " Nov 28 13:39:07 crc kubenswrapper[4970]: I1128 13:39:07.517828 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8d33b453-5415-4302-a3a6-77a74219cf38-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "8d33b453-5415-4302-a3a6-77a74219cf38" (UID: "8d33b453-5415-4302-a3a6-77a74219cf38"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 28 13:39:07 crc kubenswrapper[4970]: I1128 13:39:07.539431 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8d33b453-5415-4302-a3a6-77a74219cf38-kube-api-access-mzpqn" (OuterVolumeSpecName: "kube-api-access-mzpqn") pod "8d33b453-5415-4302-a3a6-77a74219cf38" (UID: "8d33b453-5415-4302-a3a6-77a74219cf38"). InnerVolumeSpecName "kube-api-access-mzpqn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 28 13:39:07 crc kubenswrapper[4970]: I1128 13:39:07.619040 4970 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8d33b453-5415-4302-a3a6-77a74219cf38-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 28 13:39:07 crc kubenswrapper[4970]: I1128 13:39:07.619073 4970 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mzpqn\" (UniqueName: \"kubernetes.io/projected/8d33b453-5415-4302-a3a6-77a74219cf38-kube-api-access-mzpqn\") on node \"crc\" DevicePath \"\"" Nov 28 13:39:08 crc kubenswrapper[4970]: I1128 13:39:08.068132 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-db-create-vc4f6" event={"ID":"8d33b453-5415-4302-a3a6-77a74219cf38","Type":"ContainerDied","Data":"16ae85821a0641a53d3071e8639106486c8ea1423594a29296f1c9612068271d"} Nov 28 13:39:08 crc kubenswrapper[4970]: I1128 13:39:08.068197 4970 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="16ae85821a0641a53d3071e8639106486c8ea1423594a29296f1c9612068271d" Nov 28 13:39:08 crc kubenswrapper[4970]: I1128 13:39:08.068162 4970 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone-db-create-vc4f6" Nov 28 13:39:08 crc kubenswrapper[4970]: I1128 13:39:08.070041 4970 generic.go:334] "Generic (PLEG): container finished" podID="5f3d4672-d9ce-46b4-829a-b595fc3909ca" containerID="8d5290d1b1410eb750de6e70caeb91fa0c9236d2a5f735a059577ae1ed60c3ef" exitCode=0 Nov 28 13:39:08 crc kubenswrapper[4970]: I1128 13:39:08.070090 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-66ba-account-create-update-q4plw" event={"ID":"5f3d4672-d9ce-46b4-829a-b595fc3909ca","Type":"ContainerDied","Data":"8d5290d1b1410eb750de6e70caeb91fa0c9236d2a5f735a059577ae1ed60c3ef"} Nov 28 13:39:09 crc kubenswrapper[4970]: I1128 13:39:09.374407 4970 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone-66ba-account-create-update-q4plw" Nov 28 13:39:09 crc kubenswrapper[4970]: I1128 13:39:09.550947 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5f3d4672-d9ce-46b4-829a-b595fc3909ca-operator-scripts\") pod \"5f3d4672-d9ce-46b4-829a-b595fc3909ca\" (UID: \"5f3d4672-d9ce-46b4-829a-b595fc3909ca\") " Nov 28 13:39:09 crc kubenswrapper[4970]: I1128 13:39:09.551001 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4p6m7\" (UniqueName: \"kubernetes.io/projected/5f3d4672-d9ce-46b4-829a-b595fc3909ca-kube-api-access-4p6m7\") pod \"5f3d4672-d9ce-46b4-829a-b595fc3909ca\" (UID: \"5f3d4672-d9ce-46b4-829a-b595fc3909ca\") " Nov 28 13:39:09 crc kubenswrapper[4970]: I1128 13:39:09.551728 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5f3d4672-d9ce-46b4-829a-b595fc3909ca-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "5f3d4672-d9ce-46b4-829a-b595fc3909ca" (UID: "5f3d4672-d9ce-46b4-829a-b595fc3909ca"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 28 13:39:09 crc kubenswrapper[4970]: I1128 13:39:09.552892 4970 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5f3d4672-d9ce-46b4-829a-b595fc3909ca-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 28 13:39:09 crc kubenswrapper[4970]: I1128 13:39:09.576550 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5f3d4672-d9ce-46b4-829a-b595fc3909ca-kube-api-access-4p6m7" (OuterVolumeSpecName: "kube-api-access-4p6m7") pod "5f3d4672-d9ce-46b4-829a-b595fc3909ca" (UID: "5f3d4672-d9ce-46b4-829a-b595fc3909ca"). InnerVolumeSpecName "kube-api-access-4p6m7". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 28 13:39:09 crc kubenswrapper[4970]: I1128 13:39:09.653783 4970 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4p6m7\" (UniqueName: \"kubernetes.io/projected/5f3d4672-d9ce-46b4-829a-b595fc3909ca-kube-api-access-4p6m7\") on node \"crc\" DevicePath \"\"" Nov 28 13:39:10 crc kubenswrapper[4970]: I1128 13:39:10.089151 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-66ba-account-create-update-q4plw" event={"ID":"5f3d4672-d9ce-46b4-829a-b595fc3909ca","Type":"ContainerDied","Data":"2e71b29ae65df83600fe6f88cb1a9abda9e88e8b793dbb92d8da9c51320d5314"} Nov 28 13:39:10 crc kubenswrapper[4970]: I1128 13:39:10.089561 4970 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2e71b29ae65df83600fe6f88cb1a9abda9e88e8b793dbb92d8da9c51320d5314" Nov 28 13:39:10 crc kubenswrapper[4970]: I1128 13:39:10.089417 4970 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone-66ba-account-create-update-q4plw" Nov 28 13:39:15 crc kubenswrapper[4970]: I1128 13:39:15.168251 4970 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["keystone-kuttl-tests/keystone-db-sync-4frfv"] Nov 28 13:39:15 crc kubenswrapper[4970]: E1128 13:39:15.169315 4970 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8d33b453-5415-4302-a3a6-77a74219cf38" containerName="mariadb-database-create" Nov 28 13:39:15 crc kubenswrapper[4970]: I1128 13:39:15.169338 4970 state_mem.go:107] "Deleted CPUSet assignment" podUID="8d33b453-5415-4302-a3a6-77a74219cf38" containerName="mariadb-database-create" Nov 28 13:39:15 crc kubenswrapper[4970]: E1128 13:39:15.169377 4970 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5f3d4672-d9ce-46b4-829a-b595fc3909ca" containerName="mariadb-account-create-update" Nov 28 13:39:15 crc kubenswrapper[4970]: I1128 13:39:15.169390 4970 state_mem.go:107] "Deleted CPUSet assignment" podUID="5f3d4672-d9ce-46b4-829a-b595fc3909ca" containerName="mariadb-account-create-update" Nov 28 13:39:15 crc kubenswrapper[4970]: I1128 13:39:15.169588 4970 memory_manager.go:354] "RemoveStaleState removing state" podUID="8d33b453-5415-4302-a3a6-77a74219cf38" containerName="mariadb-database-create" Nov 28 13:39:15 crc kubenswrapper[4970]: I1128 13:39:15.169609 4970 memory_manager.go:354] "RemoveStaleState removing state" podUID="5f3d4672-d9ce-46b4-829a-b595fc3909ca" containerName="mariadb-account-create-update" Nov 28 13:39:15 crc kubenswrapper[4970]: I1128 13:39:15.170286 4970 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="keystone-kuttl-tests/keystone-db-sync-4frfv" Nov 28 13:39:15 crc kubenswrapper[4970]: I1128 13:39:15.172802 4970 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"combined-ca-bundle" Nov 28 13:39:15 crc kubenswrapper[4970]: I1128 13:39:15.173393 4970 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"keystone-scripts" Nov 28 13:39:15 crc kubenswrapper[4970]: I1128 13:39:15.178779 4970 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"keystone" Nov 28 13:39:15 crc kubenswrapper[4970]: I1128 13:39:15.178869 4970 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"keystone-keystone-dockercfg-fnwjv" Nov 28 13:39:15 crc kubenswrapper[4970]: I1128 13:39:15.179011 4970 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"keystone-config-data" Nov 28 13:39:15 crc kubenswrapper[4970]: I1128 13:39:15.179444 4970 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/keystone-db-sync-4frfv"] Nov 28 13:39:15 crc kubenswrapper[4970]: I1128 13:39:15.336958 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dbbf378e-cef0-45b9-a48e-a22b783477b7-config-data\") pod \"keystone-db-sync-4frfv\" (UID: \"dbbf378e-cef0-45b9-a48e-a22b783477b7\") " pod="keystone-kuttl-tests/keystone-db-sync-4frfv" Nov 28 13:39:15 crc kubenswrapper[4970]: I1128 13:39:15.337022 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dbbf378e-cef0-45b9-a48e-a22b783477b7-combined-ca-bundle\") pod \"keystone-db-sync-4frfv\" (UID: \"dbbf378e-cef0-45b9-a48e-a22b783477b7\") " pod="keystone-kuttl-tests/keystone-db-sync-4frfv" Nov 28 13:39:15 crc kubenswrapper[4970]: I1128 13:39:15.337078 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-twncx\" (UniqueName: \"kubernetes.io/projected/dbbf378e-cef0-45b9-a48e-a22b783477b7-kube-api-access-twncx\") pod \"keystone-db-sync-4frfv\" (UID: \"dbbf378e-cef0-45b9-a48e-a22b783477b7\") " pod="keystone-kuttl-tests/keystone-db-sync-4frfv" Nov 28 13:39:15 crc kubenswrapper[4970]: I1128 13:39:15.438783 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-twncx\" (UniqueName: \"kubernetes.io/projected/dbbf378e-cef0-45b9-a48e-a22b783477b7-kube-api-access-twncx\") pod \"keystone-db-sync-4frfv\" (UID: \"dbbf378e-cef0-45b9-a48e-a22b783477b7\") " pod="keystone-kuttl-tests/keystone-db-sync-4frfv" Nov 28 13:39:15 crc kubenswrapper[4970]: I1128 13:39:15.438968 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dbbf378e-cef0-45b9-a48e-a22b783477b7-config-data\") pod \"keystone-db-sync-4frfv\" (UID: \"dbbf378e-cef0-45b9-a48e-a22b783477b7\") " pod="keystone-kuttl-tests/keystone-db-sync-4frfv" Nov 28 13:39:15 crc kubenswrapper[4970]: I1128 13:39:15.439020 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dbbf378e-cef0-45b9-a48e-a22b783477b7-combined-ca-bundle\") pod \"keystone-db-sync-4frfv\" (UID: \"dbbf378e-cef0-45b9-a48e-a22b783477b7\") " pod="keystone-kuttl-tests/keystone-db-sync-4frfv" Nov 28 13:39:15 crc 
kubenswrapper[4970]: I1128 13:39:15.449480 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dbbf378e-cef0-45b9-a48e-a22b783477b7-config-data\") pod \"keystone-db-sync-4frfv\" (UID: \"dbbf378e-cef0-45b9-a48e-a22b783477b7\") " pod="keystone-kuttl-tests/keystone-db-sync-4frfv" Nov 28 13:39:15 crc kubenswrapper[4970]: I1128 13:39:15.451046 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dbbf378e-cef0-45b9-a48e-a22b783477b7-combined-ca-bundle\") pod \"keystone-db-sync-4frfv\" (UID: \"dbbf378e-cef0-45b9-a48e-a22b783477b7\") " pod="keystone-kuttl-tests/keystone-db-sync-4frfv" Nov 28 13:39:15 crc kubenswrapper[4970]: I1128 13:39:15.482360 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-twncx\" (UniqueName: \"kubernetes.io/projected/dbbf378e-cef0-45b9-a48e-a22b783477b7-kube-api-access-twncx\") pod \"keystone-db-sync-4frfv\" (UID: \"dbbf378e-cef0-45b9-a48e-a22b783477b7\") " pod="keystone-kuttl-tests/keystone-db-sync-4frfv" Nov 28 13:39:15 crc kubenswrapper[4970]: I1128 13:39:15.502902 4970 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone-db-sync-4frfv" Nov 28 13:39:16 crc kubenswrapper[4970]: I1128 13:39:16.023456 4970 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/keystone-db-sync-4frfv"] Nov 28 13:39:16 crc kubenswrapper[4970]: I1128 13:39:16.142066 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-db-sync-4frfv" event={"ID":"dbbf378e-cef0-45b9-a48e-a22b783477b7","Type":"ContainerStarted","Data":"14b621833f9843d9a01b4f4801e812c7ca2f1380576dc90a3df1445e43437e62"} Nov 28 13:39:17 crc kubenswrapper[4970]: I1128 13:39:17.150264 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-db-sync-4frfv" event={"ID":"dbbf378e-cef0-45b9-a48e-a22b783477b7","Type":"ContainerStarted","Data":"877de299d69ff0ee836fcef3c0c71152a9538f3cfe202ca93873aab38f1c641f"} Nov 28 13:39:17 crc kubenswrapper[4970]: I1128 13:39:17.185239 4970 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="keystone-kuttl-tests/keystone-db-sync-4frfv" podStartSLOduration=2.185189919 podStartE2EDuration="2.185189919s" podCreationTimestamp="2025-11-28 13:39:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-28 13:39:17.175028931 +0000 UTC m=+1168.027910741" watchObservedRunningTime="2025-11-28 13:39:17.185189919 +0000 UTC m=+1168.038071739" Nov 28 13:39:18 crc kubenswrapper[4970]: I1128 13:39:18.158368 4970 generic.go:334] "Generic (PLEG): container finished" podID="dbbf378e-cef0-45b9-a48e-a22b783477b7" containerID="877de299d69ff0ee836fcef3c0c71152a9538f3cfe202ca93873aab38f1c641f" exitCode=0 Nov 28 13:39:18 crc kubenswrapper[4970]: I1128 13:39:18.158409 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-db-sync-4frfv" event={"ID":"dbbf378e-cef0-45b9-a48e-a22b783477b7","Type":"ContainerDied","Data":"877de299d69ff0ee836fcef3c0c71152a9538f3cfe202ca93873aab38f1c641f"} Nov 28 13:39:19 crc kubenswrapper[4970]: I1128 13:39:19.434830 4970 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="keystone-kuttl-tests/keystone-db-sync-4frfv" Nov 28 13:39:19 crc kubenswrapper[4970]: I1128 13:39:19.496657 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-twncx\" (UniqueName: \"kubernetes.io/projected/dbbf378e-cef0-45b9-a48e-a22b783477b7-kube-api-access-twncx\") pod \"dbbf378e-cef0-45b9-a48e-a22b783477b7\" (UID: \"dbbf378e-cef0-45b9-a48e-a22b783477b7\") " Nov 28 13:39:19 crc kubenswrapper[4970]: I1128 13:39:19.496731 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dbbf378e-cef0-45b9-a48e-a22b783477b7-combined-ca-bundle\") pod \"dbbf378e-cef0-45b9-a48e-a22b783477b7\" (UID: \"dbbf378e-cef0-45b9-a48e-a22b783477b7\") " Nov 28 13:39:19 crc kubenswrapper[4970]: I1128 13:39:19.496771 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dbbf378e-cef0-45b9-a48e-a22b783477b7-config-data\") pod \"dbbf378e-cef0-45b9-a48e-a22b783477b7\" (UID: \"dbbf378e-cef0-45b9-a48e-a22b783477b7\") " Nov 28 13:39:19 crc kubenswrapper[4970]: I1128 13:39:19.502427 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dbbf378e-cef0-45b9-a48e-a22b783477b7-kube-api-access-twncx" (OuterVolumeSpecName: "kube-api-access-twncx") pod "dbbf378e-cef0-45b9-a48e-a22b783477b7" (UID: "dbbf378e-cef0-45b9-a48e-a22b783477b7"). InnerVolumeSpecName "kube-api-access-twncx". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 28 13:39:19 crc kubenswrapper[4970]: I1128 13:39:19.515154 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dbbf378e-cef0-45b9-a48e-a22b783477b7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "dbbf378e-cef0-45b9-a48e-a22b783477b7" (UID: "dbbf378e-cef0-45b9-a48e-a22b783477b7"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 28 13:39:19 crc kubenswrapper[4970]: I1128 13:39:19.542166 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dbbf378e-cef0-45b9-a48e-a22b783477b7-config-data" (OuterVolumeSpecName: "config-data") pod "dbbf378e-cef0-45b9-a48e-a22b783477b7" (UID: "dbbf378e-cef0-45b9-a48e-a22b783477b7"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 28 13:39:19 crc kubenswrapper[4970]: I1128 13:39:19.598986 4970 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-twncx\" (UniqueName: \"kubernetes.io/projected/dbbf378e-cef0-45b9-a48e-a22b783477b7-kube-api-access-twncx\") on node \"crc\" DevicePath \"\"" Nov 28 13:39:19 crc kubenswrapper[4970]: I1128 13:39:19.599060 4970 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dbbf378e-cef0-45b9-a48e-a22b783477b7-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 28 13:39:19 crc kubenswrapper[4970]: I1128 13:39:19.599088 4970 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dbbf378e-cef0-45b9-a48e-a22b783477b7-config-data\") on node \"crc\" DevicePath \"\"" Nov 28 13:39:20 crc kubenswrapper[4970]: I1128 13:39:20.185183 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-db-sync-4frfv" event={"ID":"dbbf378e-cef0-45b9-a48e-a22b783477b7","Type":"ContainerDied","Data":"14b621833f9843d9a01b4f4801e812c7ca2f1380576dc90a3df1445e43437e62"} Nov 28 13:39:20 crc kubenswrapper[4970]: I1128 13:39:20.185530 4970 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="14b621833f9843d9a01b4f4801e812c7ca2f1380576dc90a3df1445e43437e62" Nov 28 13:39:20 crc kubenswrapper[4970]: I1128 13:39:20.185589 4970 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone-db-sync-4frfv" Nov 28 13:39:20 crc kubenswrapper[4970]: I1128 13:39:20.660123 4970 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["keystone-kuttl-tests/keystone-bootstrap-pmplv"] Nov 28 13:39:20 crc kubenswrapper[4970]: E1128 13:39:20.661034 4970 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dbbf378e-cef0-45b9-a48e-a22b783477b7" containerName="keystone-db-sync" Nov 28 13:39:20 crc kubenswrapper[4970]: I1128 13:39:20.661060 4970 state_mem.go:107] "Deleted CPUSet assignment" podUID="dbbf378e-cef0-45b9-a48e-a22b783477b7" containerName="keystone-db-sync" Nov 28 13:39:20 crc kubenswrapper[4970]: I1128 13:39:20.661352 4970 memory_manager.go:354] "RemoveStaleState removing state" podUID="dbbf378e-cef0-45b9-a48e-a22b783477b7" containerName="keystone-db-sync" Nov 28 13:39:20 crc kubenswrapper[4970]: I1128 13:39:20.661950 4970 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="keystone-kuttl-tests/keystone-bootstrap-pmplv" Nov 28 13:39:20 crc kubenswrapper[4970]: I1128 13:39:20.670187 4970 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"keystone-config-data" Nov 28 13:39:20 crc kubenswrapper[4970]: I1128 13:39:20.670752 4970 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"keystone" Nov 28 13:39:20 crc kubenswrapper[4970]: I1128 13:39:20.671481 4970 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"combined-ca-bundle" Nov 28 13:39:20 crc kubenswrapper[4970]: I1128 13:39:20.672098 4970 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"osp-secret" Nov 28 13:39:20 crc kubenswrapper[4970]: I1128 13:39:20.672400 4970 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"keystone-keystone-dockercfg-fnwjv" Nov 28 13:39:20 crc kubenswrapper[4970]: I1128 13:39:20.674824 4970 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"keystone-scripts" Nov 28 13:39:20 crc kubenswrapper[4970]: I1128 13:39:20.697116 4970 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/keystone-bootstrap-pmplv"] Nov 28 13:39:20 crc kubenswrapper[4970]: I1128 13:39:20.714974 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/e6adc5b4-dbc2-4c07-923b-593f89d097ac-credential-keys\") pod \"keystone-bootstrap-pmplv\" (UID: \"e6adc5b4-dbc2-4c07-923b-593f89d097ac\") " pod="keystone-kuttl-tests/keystone-bootstrap-pmplv" Nov 28 13:39:20 crc kubenswrapper[4970]: I1128 13:39:20.715036 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e6adc5b4-dbc2-4c07-923b-593f89d097ac-scripts\") pod \"keystone-bootstrap-pmplv\" (UID: \"e6adc5b4-dbc2-4c07-923b-593f89d097ac\") " pod="keystone-kuttl-tests/keystone-bootstrap-pmplv" Nov 28 13:39:20 crc kubenswrapper[4970]: I1128 13:39:20.715127 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e6adc5b4-dbc2-4c07-923b-593f89d097ac-combined-ca-bundle\") pod \"keystone-bootstrap-pmplv\" (UID: \"e6adc5b4-dbc2-4c07-923b-593f89d097ac\") " pod="keystone-kuttl-tests/keystone-bootstrap-pmplv" Nov 28 13:39:20 crc kubenswrapper[4970]: I1128 13:39:20.715278 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/e6adc5b4-dbc2-4c07-923b-593f89d097ac-fernet-keys\") pod \"keystone-bootstrap-pmplv\" (UID: \"e6adc5b4-dbc2-4c07-923b-593f89d097ac\") " pod="keystone-kuttl-tests/keystone-bootstrap-pmplv" Nov 28 13:39:20 crc kubenswrapper[4970]: I1128 13:39:20.715378 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6v2l2\" (UniqueName: \"kubernetes.io/projected/e6adc5b4-dbc2-4c07-923b-593f89d097ac-kube-api-access-6v2l2\") pod \"keystone-bootstrap-pmplv\" (UID: \"e6adc5b4-dbc2-4c07-923b-593f89d097ac\") " pod="keystone-kuttl-tests/keystone-bootstrap-pmplv" Nov 28 13:39:20 crc kubenswrapper[4970]: I1128 13:39:20.715470 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/e6adc5b4-dbc2-4c07-923b-593f89d097ac-config-data\") pod \"keystone-bootstrap-pmplv\" (UID: \"e6adc5b4-dbc2-4c07-923b-593f89d097ac\") " pod="keystone-kuttl-tests/keystone-bootstrap-pmplv" Nov 28 13:39:20 crc kubenswrapper[4970]: I1128 13:39:20.816500 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e6adc5b4-dbc2-4c07-923b-593f89d097ac-combined-ca-bundle\") pod \"keystone-bootstrap-pmplv\" (UID: \"e6adc5b4-dbc2-4c07-923b-593f89d097ac\") " pod="keystone-kuttl-tests/keystone-bootstrap-pmplv" Nov 28 13:39:20 crc kubenswrapper[4970]: I1128 13:39:20.816547 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/e6adc5b4-dbc2-4c07-923b-593f89d097ac-fernet-keys\") pod \"keystone-bootstrap-pmplv\" (UID: \"e6adc5b4-dbc2-4c07-923b-593f89d097ac\") " pod="keystone-kuttl-tests/keystone-bootstrap-pmplv" Nov 28 13:39:20 crc kubenswrapper[4970]: I1128 13:39:20.816566 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6v2l2\" (UniqueName: \"kubernetes.io/projected/e6adc5b4-dbc2-4c07-923b-593f89d097ac-kube-api-access-6v2l2\") pod \"keystone-bootstrap-pmplv\" (UID: \"e6adc5b4-dbc2-4c07-923b-593f89d097ac\") " pod="keystone-kuttl-tests/keystone-bootstrap-pmplv" Nov 28 13:39:20 crc kubenswrapper[4970]: I1128 13:39:20.816593 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e6adc5b4-dbc2-4c07-923b-593f89d097ac-config-data\") pod \"keystone-bootstrap-pmplv\" (UID: \"e6adc5b4-dbc2-4c07-923b-593f89d097ac\") " pod="keystone-kuttl-tests/keystone-bootstrap-pmplv" Nov 28 13:39:20 crc kubenswrapper[4970]: I1128 13:39:20.816618 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/e6adc5b4-dbc2-4c07-923b-593f89d097ac-credential-keys\") pod \"keystone-bootstrap-pmplv\" (UID: \"e6adc5b4-dbc2-4c07-923b-593f89d097ac\") " pod="keystone-kuttl-tests/keystone-bootstrap-pmplv" Nov 28 13:39:20 crc kubenswrapper[4970]: I1128 13:39:20.816638 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e6adc5b4-dbc2-4c07-923b-593f89d097ac-scripts\") pod \"keystone-bootstrap-pmplv\" (UID: \"e6adc5b4-dbc2-4c07-923b-593f89d097ac\") " pod="keystone-kuttl-tests/keystone-bootstrap-pmplv" Nov 28 13:39:20 crc kubenswrapper[4970]: I1128 13:39:20.820683 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/e6adc5b4-dbc2-4c07-923b-593f89d097ac-credential-keys\") pod \"keystone-bootstrap-pmplv\" (UID: \"e6adc5b4-dbc2-4c07-923b-593f89d097ac\") " pod="keystone-kuttl-tests/keystone-bootstrap-pmplv" Nov 28 13:39:20 crc kubenswrapper[4970]: I1128 13:39:20.820914 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/e6adc5b4-dbc2-4c07-923b-593f89d097ac-fernet-keys\") pod \"keystone-bootstrap-pmplv\" (UID: \"e6adc5b4-dbc2-4c07-923b-593f89d097ac\") " pod="keystone-kuttl-tests/keystone-bootstrap-pmplv" Nov 28 13:39:20 crc kubenswrapper[4970]: I1128 13:39:20.820953 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/e6adc5b4-dbc2-4c07-923b-593f89d097ac-combined-ca-bundle\") pod \"keystone-bootstrap-pmplv\" (UID: \"e6adc5b4-dbc2-4c07-923b-593f89d097ac\") " pod="keystone-kuttl-tests/keystone-bootstrap-pmplv" Nov 28 13:39:20 crc kubenswrapper[4970]: I1128 13:39:20.821016 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e6adc5b4-dbc2-4c07-923b-593f89d097ac-config-data\") pod \"keystone-bootstrap-pmplv\" (UID: \"e6adc5b4-dbc2-4c07-923b-593f89d097ac\") " pod="keystone-kuttl-tests/keystone-bootstrap-pmplv" Nov 28 13:39:20 crc kubenswrapper[4970]: I1128 13:39:20.826843 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e6adc5b4-dbc2-4c07-923b-593f89d097ac-scripts\") pod \"keystone-bootstrap-pmplv\" (UID: \"e6adc5b4-dbc2-4c07-923b-593f89d097ac\") " pod="keystone-kuttl-tests/keystone-bootstrap-pmplv" Nov 28 13:39:20 crc kubenswrapper[4970]: I1128 13:39:20.830956 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6v2l2\" (UniqueName: \"kubernetes.io/projected/e6adc5b4-dbc2-4c07-923b-593f89d097ac-kube-api-access-6v2l2\") pod \"keystone-bootstrap-pmplv\" (UID: \"e6adc5b4-dbc2-4c07-923b-593f89d097ac\") " pod="keystone-kuttl-tests/keystone-bootstrap-pmplv" Nov 28 13:39:20 crc kubenswrapper[4970]: I1128 13:39:20.995425 4970 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone-bootstrap-pmplv" Nov 28 13:39:21 crc kubenswrapper[4970]: I1128 13:39:21.241590 4970 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/keystone-bootstrap-pmplv"] Nov 28 13:39:21 crc kubenswrapper[4970]: W1128 13:39:21.250819 4970 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode6adc5b4_dbc2_4c07_923b_593f89d097ac.slice/crio-039f4f836105177e9a0b7fda152015fb755e3b3039ee80940d67cf14cf53a2f0 WatchSource:0}: Error finding container 039f4f836105177e9a0b7fda152015fb755e3b3039ee80940d67cf14cf53a2f0: Status 404 returned error can't find the container with id 039f4f836105177e9a0b7fda152015fb755e3b3039ee80940d67cf14cf53a2f0 Nov 28 13:39:22 crc kubenswrapper[4970]: I1128 13:39:22.202902 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-bootstrap-pmplv" event={"ID":"e6adc5b4-dbc2-4c07-923b-593f89d097ac","Type":"ContainerStarted","Data":"d3be7ca8abac0727af7754c96865eff56935a731cae22281811af54886785ab2"} Nov 28 13:39:22 crc kubenswrapper[4970]: I1128 13:39:22.203475 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-bootstrap-pmplv" event={"ID":"e6adc5b4-dbc2-4c07-923b-593f89d097ac","Type":"ContainerStarted","Data":"039f4f836105177e9a0b7fda152015fb755e3b3039ee80940d67cf14cf53a2f0"} Nov 28 13:39:22 crc kubenswrapper[4970]: I1128 13:39:22.244718 4970 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="keystone-kuttl-tests/keystone-bootstrap-pmplv" podStartSLOduration=2.244691743 podStartE2EDuration="2.244691743s" podCreationTimestamp="2025-11-28 13:39:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-28 13:39:22.229920075 +0000 UTC m=+1173.082801885" watchObservedRunningTime="2025-11-28 13:39:22.244691743 +0000 UTC m=+1173.097573583" Nov 28 13:39:24 crc kubenswrapper[4970]: I1128 13:39:24.221044 4970 
generic.go:334] "Generic (PLEG): container finished" podID="e6adc5b4-dbc2-4c07-923b-593f89d097ac" containerID="d3be7ca8abac0727af7754c96865eff56935a731cae22281811af54886785ab2" exitCode=0 Nov 28 13:39:24 crc kubenswrapper[4970]: I1128 13:39:24.221161 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-bootstrap-pmplv" event={"ID":"e6adc5b4-dbc2-4c07-923b-593f89d097ac","Type":"ContainerDied","Data":"d3be7ca8abac0727af7754c96865eff56935a731cae22281811af54886785ab2"} Nov 28 13:39:25 crc kubenswrapper[4970]: I1128 13:39:25.508883 4970 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone-bootstrap-pmplv" Nov 28 13:39:25 crc kubenswrapper[4970]: I1128 13:39:25.585546 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6v2l2\" (UniqueName: \"kubernetes.io/projected/e6adc5b4-dbc2-4c07-923b-593f89d097ac-kube-api-access-6v2l2\") pod \"e6adc5b4-dbc2-4c07-923b-593f89d097ac\" (UID: \"e6adc5b4-dbc2-4c07-923b-593f89d097ac\") " Nov 28 13:39:25 crc kubenswrapper[4970]: I1128 13:39:25.585635 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/e6adc5b4-dbc2-4c07-923b-593f89d097ac-credential-keys\") pod \"e6adc5b4-dbc2-4c07-923b-593f89d097ac\" (UID: \"e6adc5b4-dbc2-4c07-923b-593f89d097ac\") " Nov 28 13:39:25 crc kubenswrapper[4970]: I1128 13:39:25.585714 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e6adc5b4-dbc2-4c07-923b-593f89d097ac-combined-ca-bundle\") pod \"e6adc5b4-dbc2-4c07-923b-593f89d097ac\" (UID: \"e6adc5b4-dbc2-4c07-923b-593f89d097ac\") " Nov 28 13:39:25 crc kubenswrapper[4970]: I1128 13:39:25.585766 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e6adc5b4-dbc2-4c07-923b-593f89d097ac-config-data\") pod \"e6adc5b4-dbc2-4c07-923b-593f89d097ac\" (UID: \"e6adc5b4-dbc2-4c07-923b-593f89d097ac\") " Nov 28 13:39:25 crc kubenswrapper[4970]: I1128 13:39:25.585813 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/e6adc5b4-dbc2-4c07-923b-593f89d097ac-fernet-keys\") pod \"e6adc5b4-dbc2-4c07-923b-593f89d097ac\" (UID: \"e6adc5b4-dbc2-4c07-923b-593f89d097ac\") " Nov 28 13:39:25 crc kubenswrapper[4970]: I1128 13:39:25.585848 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e6adc5b4-dbc2-4c07-923b-593f89d097ac-scripts\") pod \"e6adc5b4-dbc2-4c07-923b-593f89d097ac\" (UID: \"e6adc5b4-dbc2-4c07-923b-593f89d097ac\") " Nov 28 13:39:25 crc kubenswrapper[4970]: I1128 13:39:25.591680 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e6adc5b4-dbc2-4c07-923b-593f89d097ac-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "e6adc5b4-dbc2-4c07-923b-593f89d097ac" (UID: "e6adc5b4-dbc2-4c07-923b-593f89d097ac"). InnerVolumeSpecName "fernet-keys". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 28 13:39:25 crc kubenswrapper[4970]: I1128 13:39:25.592068 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e6adc5b4-dbc2-4c07-923b-593f89d097ac-kube-api-access-6v2l2" (OuterVolumeSpecName: "kube-api-access-6v2l2") pod "e6adc5b4-dbc2-4c07-923b-593f89d097ac" (UID: "e6adc5b4-dbc2-4c07-923b-593f89d097ac"). InnerVolumeSpecName "kube-api-access-6v2l2". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 28 13:39:25 crc kubenswrapper[4970]: I1128 13:39:25.592712 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e6adc5b4-dbc2-4c07-923b-593f89d097ac-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "e6adc5b4-dbc2-4c07-923b-593f89d097ac" (UID: "e6adc5b4-dbc2-4c07-923b-593f89d097ac"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 28 13:39:25 crc kubenswrapper[4970]: I1128 13:39:25.592798 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e6adc5b4-dbc2-4c07-923b-593f89d097ac-scripts" (OuterVolumeSpecName: "scripts") pod "e6adc5b4-dbc2-4c07-923b-593f89d097ac" (UID: "e6adc5b4-dbc2-4c07-923b-593f89d097ac"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 28 13:39:25 crc kubenswrapper[4970]: I1128 13:39:25.605384 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e6adc5b4-dbc2-4c07-923b-593f89d097ac-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e6adc5b4-dbc2-4c07-923b-593f89d097ac" (UID: "e6adc5b4-dbc2-4c07-923b-593f89d097ac"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 28 13:39:25 crc kubenswrapper[4970]: I1128 13:39:25.606163 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e6adc5b4-dbc2-4c07-923b-593f89d097ac-config-data" (OuterVolumeSpecName: "config-data") pod "e6adc5b4-dbc2-4c07-923b-593f89d097ac" (UID: "e6adc5b4-dbc2-4c07-923b-593f89d097ac"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 28 13:39:25 crc kubenswrapper[4970]: I1128 13:39:25.687096 4970 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/e6adc5b4-dbc2-4c07-923b-593f89d097ac-credential-keys\") on node \"crc\" DevicePath \"\"" Nov 28 13:39:25 crc kubenswrapper[4970]: I1128 13:39:25.687135 4970 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e6adc5b4-dbc2-4c07-923b-593f89d097ac-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 28 13:39:25 crc kubenswrapper[4970]: I1128 13:39:25.687154 4970 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e6adc5b4-dbc2-4c07-923b-593f89d097ac-config-data\") on node \"crc\" DevicePath \"\"" Nov 28 13:39:25 crc kubenswrapper[4970]: I1128 13:39:25.687170 4970 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/e6adc5b4-dbc2-4c07-923b-593f89d097ac-fernet-keys\") on node \"crc\" DevicePath \"\"" Nov 28 13:39:25 crc kubenswrapper[4970]: I1128 13:39:25.687187 4970 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e6adc5b4-dbc2-4c07-923b-593f89d097ac-scripts\") on node \"crc\" DevicePath \"\"" Nov 28 13:39:25 crc kubenswrapper[4970]: I1128 13:39:25.687202 4970 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6v2l2\" (UniqueName: \"kubernetes.io/projected/e6adc5b4-dbc2-4c07-923b-593f89d097ac-kube-api-access-6v2l2\") on node \"crc\" DevicePath \"\"" Nov 28 13:39:26 crc kubenswrapper[4970]: I1128 13:39:26.238956 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-bootstrap-pmplv" event={"ID":"e6adc5b4-dbc2-4c07-923b-593f89d097ac","Type":"ContainerDied","Data":"039f4f836105177e9a0b7fda152015fb755e3b3039ee80940d67cf14cf53a2f0"} Nov 28 13:39:26 crc kubenswrapper[4970]: I1128 13:39:26.239452 4970 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="039f4f836105177e9a0b7fda152015fb755e3b3039ee80940d67cf14cf53a2f0" Nov 28 13:39:26 crc kubenswrapper[4970]: I1128 13:39:26.239052 4970 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone-bootstrap-pmplv" Nov 28 13:39:26 crc kubenswrapper[4970]: I1128 13:39:26.621502 4970 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["keystone-kuttl-tests/keystone-6d5764bf8b-f5qjg"] Nov 28 13:39:26 crc kubenswrapper[4970]: E1128 13:39:26.621823 4970 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e6adc5b4-dbc2-4c07-923b-593f89d097ac" containerName="keystone-bootstrap" Nov 28 13:39:26 crc kubenswrapper[4970]: I1128 13:39:26.621840 4970 state_mem.go:107] "Deleted CPUSet assignment" podUID="e6adc5b4-dbc2-4c07-923b-593f89d097ac" containerName="keystone-bootstrap" Nov 28 13:39:26 crc kubenswrapper[4970]: I1128 13:39:26.621998 4970 memory_manager.go:354] "RemoveStaleState removing state" podUID="e6adc5b4-dbc2-4c07-923b-593f89d097ac" containerName="keystone-bootstrap" Nov 28 13:39:26 crc kubenswrapper[4970]: I1128 13:39:26.622635 4970 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="keystone-kuttl-tests/keystone-6d5764bf8b-f5qjg" Nov 28 13:39:26 crc kubenswrapper[4970]: I1128 13:39:26.624553 4970 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"keystone" Nov 28 13:39:26 crc kubenswrapper[4970]: I1128 13:39:26.624836 4970 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"cert-keystone-public-svc" Nov 28 13:39:26 crc kubenswrapper[4970]: I1128 13:39:26.625091 4970 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"combined-ca-bundle" Nov 28 13:39:26 crc kubenswrapper[4970]: I1128 13:39:26.625797 4970 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"keystone-config-data" Nov 28 13:39:26 crc kubenswrapper[4970]: I1128 13:39:26.626121 4970 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"cert-keystone-internal-svc" Nov 28 13:39:26 crc kubenswrapper[4970]: I1128 13:39:26.626764 4970 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"keystone-keystone-dockercfg-fnwjv" Nov 28 13:39:26 crc kubenswrapper[4970]: I1128 13:39:26.626893 4970 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"keystone-scripts" Nov 28 13:39:26 crc kubenswrapper[4970]: I1128 13:39:26.644758 4970 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/keystone-6d5764bf8b-f5qjg"] Nov 28 13:39:26 crc kubenswrapper[4970]: I1128 13:39:26.701868 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/2967a1ca-b9f6-4f48-a371-c46581e8d68a-fernet-keys\") pod \"keystone-6d5764bf8b-f5qjg\" (UID: \"2967a1ca-b9f6-4f48-a371-c46581e8d68a\") " pod="keystone-kuttl-tests/keystone-6d5764bf8b-f5qjg" Nov 28 13:39:26 crc kubenswrapper[4970]: I1128 13:39:26.701913 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2967a1ca-b9f6-4f48-a371-c46581e8d68a-scripts\") pod \"keystone-6d5764bf8b-f5qjg\" (UID: \"2967a1ca-b9f6-4f48-a371-c46581e8d68a\") " pod="keystone-kuttl-tests/keystone-6d5764bf8b-f5qjg" Nov 28 13:39:26 crc kubenswrapper[4970]: I1128 13:39:26.701947 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dhlf7\" (UniqueName: \"kubernetes.io/projected/2967a1ca-b9f6-4f48-a371-c46581e8d68a-kube-api-access-dhlf7\") pod \"keystone-6d5764bf8b-f5qjg\" (UID: \"2967a1ca-b9f6-4f48-a371-c46581e8d68a\") " pod="keystone-kuttl-tests/keystone-6d5764bf8b-f5qjg" Nov 28 13:39:26 crc kubenswrapper[4970]: I1128 13:39:26.701970 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2967a1ca-b9f6-4f48-a371-c46581e8d68a-combined-ca-bundle\") pod \"keystone-6d5764bf8b-f5qjg\" (UID: \"2967a1ca-b9f6-4f48-a371-c46581e8d68a\") " pod="keystone-kuttl-tests/keystone-6d5764bf8b-f5qjg" Nov 28 13:39:26 crc kubenswrapper[4970]: I1128 13:39:26.702368 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/2967a1ca-b9f6-4f48-a371-c46581e8d68a-credential-keys\") pod \"keystone-6d5764bf8b-f5qjg\" (UID: \"2967a1ca-b9f6-4f48-a371-c46581e8d68a\") " pod="keystone-kuttl-tests/keystone-6d5764bf8b-f5qjg" Nov 28 
13:39:26 crc kubenswrapper[4970]: I1128 13:39:26.702602 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2967a1ca-b9f6-4f48-a371-c46581e8d68a-config-data\") pod \"keystone-6d5764bf8b-f5qjg\" (UID: \"2967a1ca-b9f6-4f48-a371-c46581e8d68a\") " pod="keystone-kuttl-tests/keystone-6d5764bf8b-f5qjg" Nov 28 13:39:26 crc kubenswrapper[4970]: I1128 13:39:26.702817 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2967a1ca-b9f6-4f48-a371-c46581e8d68a-public-tls-certs\") pod \"keystone-6d5764bf8b-f5qjg\" (UID: \"2967a1ca-b9f6-4f48-a371-c46581e8d68a\") " pod="keystone-kuttl-tests/keystone-6d5764bf8b-f5qjg" Nov 28 13:39:26 crc kubenswrapper[4970]: I1128 13:39:26.702880 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2967a1ca-b9f6-4f48-a371-c46581e8d68a-internal-tls-certs\") pod \"keystone-6d5764bf8b-f5qjg\" (UID: \"2967a1ca-b9f6-4f48-a371-c46581e8d68a\") " pod="keystone-kuttl-tests/keystone-6d5764bf8b-f5qjg" Nov 28 13:39:26 crc kubenswrapper[4970]: I1128 13:39:26.803300 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2967a1ca-b9f6-4f48-a371-c46581e8d68a-public-tls-certs\") pod \"keystone-6d5764bf8b-f5qjg\" (UID: \"2967a1ca-b9f6-4f48-a371-c46581e8d68a\") " pod="keystone-kuttl-tests/keystone-6d5764bf8b-f5qjg" Nov 28 13:39:26 crc kubenswrapper[4970]: I1128 13:39:26.803341 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2967a1ca-b9f6-4f48-a371-c46581e8d68a-internal-tls-certs\") pod \"keystone-6d5764bf8b-f5qjg\" (UID: \"2967a1ca-b9f6-4f48-a371-c46581e8d68a\") " pod="keystone-kuttl-tests/keystone-6d5764bf8b-f5qjg" Nov 28 13:39:26 crc kubenswrapper[4970]: I1128 13:39:26.803382 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/2967a1ca-b9f6-4f48-a371-c46581e8d68a-fernet-keys\") pod \"keystone-6d5764bf8b-f5qjg\" (UID: \"2967a1ca-b9f6-4f48-a371-c46581e8d68a\") " pod="keystone-kuttl-tests/keystone-6d5764bf8b-f5qjg" Nov 28 13:39:26 crc kubenswrapper[4970]: I1128 13:39:26.803403 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2967a1ca-b9f6-4f48-a371-c46581e8d68a-scripts\") pod \"keystone-6d5764bf8b-f5qjg\" (UID: \"2967a1ca-b9f6-4f48-a371-c46581e8d68a\") " pod="keystone-kuttl-tests/keystone-6d5764bf8b-f5qjg" Nov 28 13:39:26 crc kubenswrapper[4970]: I1128 13:39:26.803423 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dhlf7\" (UniqueName: \"kubernetes.io/projected/2967a1ca-b9f6-4f48-a371-c46581e8d68a-kube-api-access-dhlf7\") pod \"keystone-6d5764bf8b-f5qjg\" (UID: \"2967a1ca-b9f6-4f48-a371-c46581e8d68a\") " pod="keystone-kuttl-tests/keystone-6d5764bf8b-f5qjg" Nov 28 13:39:26 crc kubenswrapper[4970]: I1128 13:39:26.803443 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2967a1ca-b9f6-4f48-a371-c46581e8d68a-combined-ca-bundle\") pod \"keystone-6d5764bf8b-f5qjg\" (UID: \"2967a1ca-b9f6-4f48-a371-c46581e8d68a\") " 
pod="keystone-kuttl-tests/keystone-6d5764bf8b-f5qjg" Nov 28 13:39:26 crc kubenswrapper[4970]: I1128 13:39:26.803473 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/2967a1ca-b9f6-4f48-a371-c46581e8d68a-credential-keys\") pod \"keystone-6d5764bf8b-f5qjg\" (UID: \"2967a1ca-b9f6-4f48-a371-c46581e8d68a\") " pod="keystone-kuttl-tests/keystone-6d5764bf8b-f5qjg" Nov 28 13:39:26 crc kubenswrapper[4970]: I1128 13:39:26.803499 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2967a1ca-b9f6-4f48-a371-c46581e8d68a-config-data\") pod \"keystone-6d5764bf8b-f5qjg\" (UID: \"2967a1ca-b9f6-4f48-a371-c46581e8d68a\") " pod="keystone-kuttl-tests/keystone-6d5764bf8b-f5qjg" Nov 28 13:39:26 crc kubenswrapper[4970]: I1128 13:39:26.809720 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2967a1ca-b9f6-4f48-a371-c46581e8d68a-public-tls-certs\") pod \"keystone-6d5764bf8b-f5qjg\" (UID: \"2967a1ca-b9f6-4f48-a371-c46581e8d68a\") " pod="keystone-kuttl-tests/keystone-6d5764bf8b-f5qjg" Nov 28 13:39:26 crc kubenswrapper[4970]: I1128 13:39:26.809748 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2967a1ca-b9f6-4f48-a371-c46581e8d68a-config-data\") pod \"keystone-6d5764bf8b-f5qjg\" (UID: \"2967a1ca-b9f6-4f48-a371-c46581e8d68a\") " pod="keystone-kuttl-tests/keystone-6d5764bf8b-f5qjg" Nov 28 13:39:26 crc kubenswrapper[4970]: I1128 13:39:26.809799 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/2967a1ca-b9f6-4f48-a371-c46581e8d68a-credential-keys\") pod \"keystone-6d5764bf8b-f5qjg\" (UID: \"2967a1ca-b9f6-4f48-a371-c46581e8d68a\") " pod="keystone-kuttl-tests/keystone-6d5764bf8b-f5qjg" Nov 28 13:39:26 crc kubenswrapper[4970]: I1128 13:39:26.809897 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2967a1ca-b9f6-4f48-a371-c46581e8d68a-scripts\") pod \"keystone-6d5764bf8b-f5qjg\" (UID: \"2967a1ca-b9f6-4f48-a371-c46581e8d68a\") " pod="keystone-kuttl-tests/keystone-6d5764bf8b-f5qjg" Nov 28 13:39:26 crc kubenswrapper[4970]: I1128 13:39:26.810320 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2967a1ca-b9f6-4f48-a371-c46581e8d68a-combined-ca-bundle\") pod \"keystone-6d5764bf8b-f5qjg\" (UID: \"2967a1ca-b9f6-4f48-a371-c46581e8d68a\") " pod="keystone-kuttl-tests/keystone-6d5764bf8b-f5qjg" Nov 28 13:39:26 crc kubenswrapper[4970]: I1128 13:39:26.816971 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/2967a1ca-b9f6-4f48-a371-c46581e8d68a-fernet-keys\") pod \"keystone-6d5764bf8b-f5qjg\" (UID: \"2967a1ca-b9f6-4f48-a371-c46581e8d68a\") " pod="keystone-kuttl-tests/keystone-6d5764bf8b-f5qjg" Nov 28 13:39:26 crc kubenswrapper[4970]: I1128 13:39:26.820464 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2967a1ca-b9f6-4f48-a371-c46581e8d68a-internal-tls-certs\") pod \"keystone-6d5764bf8b-f5qjg\" (UID: \"2967a1ca-b9f6-4f48-a371-c46581e8d68a\") " pod="keystone-kuttl-tests/keystone-6d5764bf8b-f5qjg" Nov 28 13:39:26 crc kubenswrapper[4970]: I1128 
13:39:26.830511 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dhlf7\" (UniqueName: \"kubernetes.io/projected/2967a1ca-b9f6-4f48-a371-c46581e8d68a-kube-api-access-dhlf7\") pod \"keystone-6d5764bf8b-f5qjg\" (UID: \"2967a1ca-b9f6-4f48-a371-c46581e8d68a\") " pod="keystone-kuttl-tests/keystone-6d5764bf8b-f5qjg" Nov 28 13:39:26 crc kubenswrapper[4970]: I1128 13:39:26.943280 4970 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone-6d5764bf8b-f5qjg" Nov 28 13:39:27 crc kubenswrapper[4970]: I1128 13:39:27.373035 4970 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/keystone-6d5764bf8b-f5qjg"] Nov 28 13:39:28 crc kubenswrapper[4970]: I1128 13:39:28.259329 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-6d5764bf8b-f5qjg" event={"ID":"2967a1ca-b9f6-4f48-a371-c46581e8d68a","Type":"ContainerStarted","Data":"f350f2dfe2cf611c586794231eba8667a15cf34662ad64cff43f4359d7f4f9b6"} Nov 28 13:39:28 crc kubenswrapper[4970]: I1128 13:39:28.259652 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-6d5764bf8b-f5qjg" event={"ID":"2967a1ca-b9f6-4f48-a371-c46581e8d68a","Type":"ContainerStarted","Data":"14a513c897f1835c6d2c2f1bb539072077d48ac58debed3616183ee61096c0da"} Nov 28 13:39:28 crc kubenswrapper[4970]: I1128 13:39:28.259670 4970 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="keystone-kuttl-tests/keystone-6d5764bf8b-f5qjg" Nov 28 13:39:28 crc kubenswrapper[4970]: I1128 13:39:28.282097 4970 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="keystone-kuttl-tests/keystone-6d5764bf8b-f5qjg" podStartSLOduration=2.282068643 podStartE2EDuration="2.282068643s" podCreationTimestamp="2025-11-28 13:39:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-28 13:39:28.277939536 +0000 UTC m=+1179.130821376" watchObservedRunningTime="2025-11-28 13:39:28.282068643 +0000 UTC m=+1179.134950463" Nov 28 13:39:58 crc kubenswrapper[4970]: I1128 13:39:58.392069 4970 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="keystone-kuttl-tests/keystone-6d5764bf8b-f5qjg" Nov 28 13:39:58 crc kubenswrapper[4970]: I1128 13:39:58.905790 4970 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["keystone-kuttl-tests/keystone-db-sync-4frfv"] Nov 28 13:39:58 crc kubenswrapper[4970]: I1128 13:39:58.915379 4970 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["keystone-kuttl-tests/keystone-db-sync-4frfv"] Nov 28 13:39:58 crc kubenswrapper[4970]: I1128 13:39:58.921634 4970 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["keystone-kuttl-tests/keystone-bootstrap-pmplv"] Nov 28 13:39:58 crc kubenswrapper[4970]: I1128 13:39:58.934023 4970 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["keystone-kuttl-tests/keystone-bootstrap-pmplv"] Nov 28 13:39:58 crc kubenswrapper[4970]: I1128 13:39:58.948075 4970 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["keystone-kuttl-tests/keystone-6d5764bf8b-f5qjg"] Nov 28 13:39:58 crc kubenswrapper[4970]: I1128 13:39:58.948477 4970 kuberuntime_container.go:808] "Killing container with a grace period" pod="keystone-kuttl-tests/keystone-6d5764bf8b-f5qjg" podUID="2967a1ca-b9f6-4f48-a371-c46581e8d68a" containerName="keystone-api" containerID="cri-o://f350f2dfe2cf611c586794231eba8667a15cf34662ad64cff43f4359d7f4f9b6" gracePeriod=30 Nov 28 13:39:58 
crc kubenswrapper[4970]: I1128 13:39:58.957119 4970 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["keystone-kuttl-tests/keystone66ba-account-delete-pf297"] Nov 28 13:39:58 crc kubenswrapper[4970]: I1128 13:39:58.958376 4970 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone66ba-account-delete-pf297" Nov 28 13:39:58 crc kubenswrapper[4970]: I1128 13:39:58.974463 4970 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/keystone66ba-account-delete-pf297"] Nov 28 13:39:59 crc kubenswrapper[4970]: I1128 13:39:59.120419 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z77bv\" (UniqueName: \"kubernetes.io/projected/8d4e078f-7801-41ac-a4b0-474048f86e2b-kube-api-access-z77bv\") pod \"keystone66ba-account-delete-pf297\" (UID: \"8d4e078f-7801-41ac-a4b0-474048f86e2b\") " pod="keystone-kuttl-tests/keystone66ba-account-delete-pf297" Nov 28 13:39:59 crc kubenswrapper[4970]: I1128 13:39:59.120528 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8d4e078f-7801-41ac-a4b0-474048f86e2b-operator-scripts\") pod \"keystone66ba-account-delete-pf297\" (UID: \"8d4e078f-7801-41ac-a4b0-474048f86e2b\") " pod="keystone-kuttl-tests/keystone66ba-account-delete-pf297" Nov 28 13:39:59 crc kubenswrapper[4970]: I1128 13:39:59.221816 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z77bv\" (UniqueName: \"kubernetes.io/projected/8d4e078f-7801-41ac-a4b0-474048f86e2b-kube-api-access-z77bv\") pod \"keystone66ba-account-delete-pf297\" (UID: \"8d4e078f-7801-41ac-a4b0-474048f86e2b\") " pod="keystone-kuttl-tests/keystone66ba-account-delete-pf297" Nov 28 13:39:59 crc kubenswrapper[4970]: I1128 13:39:59.222093 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8d4e078f-7801-41ac-a4b0-474048f86e2b-operator-scripts\") pod \"keystone66ba-account-delete-pf297\" (UID: \"8d4e078f-7801-41ac-a4b0-474048f86e2b\") " pod="keystone-kuttl-tests/keystone66ba-account-delete-pf297" Nov 28 13:39:59 crc kubenswrapper[4970]: I1128 13:39:59.223583 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8d4e078f-7801-41ac-a4b0-474048f86e2b-operator-scripts\") pod \"keystone66ba-account-delete-pf297\" (UID: \"8d4e078f-7801-41ac-a4b0-474048f86e2b\") " pod="keystone-kuttl-tests/keystone66ba-account-delete-pf297" Nov 28 13:39:59 crc kubenswrapper[4970]: I1128 13:39:59.248318 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z77bv\" (UniqueName: \"kubernetes.io/projected/8d4e078f-7801-41ac-a4b0-474048f86e2b-kube-api-access-z77bv\") pod \"keystone66ba-account-delete-pf297\" (UID: \"8d4e078f-7801-41ac-a4b0-474048f86e2b\") " pod="keystone-kuttl-tests/keystone66ba-account-delete-pf297" Nov 28 13:39:59 crc kubenswrapper[4970]: I1128 13:39:59.274750 4970 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="keystone-kuttl-tests/keystone66ba-account-delete-pf297" Nov 28 13:39:59 crc kubenswrapper[4970]: I1128 13:39:59.393027 4970 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dbbf378e-cef0-45b9-a48e-a22b783477b7" path="/var/lib/kubelet/pods/dbbf378e-cef0-45b9-a48e-a22b783477b7/volumes" Nov 28 13:39:59 crc kubenswrapper[4970]: I1128 13:39:59.394271 4970 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e6adc5b4-dbc2-4c07-923b-593f89d097ac" path="/var/lib/kubelet/pods/e6adc5b4-dbc2-4c07-923b-593f89d097ac/volumes" Nov 28 13:39:59 crc kubenswrapper[4970]: I1128 13:39:59.740718 4970 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/keystone66ba-account-delete-pf297"] Nov 28 13:40:00 crc kubenswrapper[4970]: I1128 13:40:00.760590 4970 generic.go:334] "Generic (PLEG): container finished" podID="8d4e078f-7801-41ac-a4b0-474048f86e2b" containerID="20ebd83a1a3d7e6f7a0f39226452867a290148296adaef862f741b6ea6619d01" exitCode=0 Nov 28 13:40:00 crc kubenswrapper[4970]: I1128 13:40:00.760666 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone66ba-account-delete-pf297" event={"ID":"8d4e078f-7801-41ac-a4b0-474048f86e2b","Type":"ContainerDied","Data":"20ebd83a1a3d7e6f7a0f39226452867a290148296adaef862f741b6ea6619d01"} Nov 28 13:40:00 crc kubenswrapper[4970]: I1128 13:40:00.761071 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone66ba-account-delete-pf297" event={"ID":"8d4e078f-7801-41ac-a4b0-474048f86e2b","Type":"ContainerStarted","Data":"9e7f5360463333a1bdd2e2a57e71510628bf0bcf15f8ee0529df822167d1ff9a"} Nov 28 13:40:02 crc kubenswrapper[4970]: I1128 13:40:02.100379 4970 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone66ba-account-delete-pf297" Nov 28 13:40:02 crc kubenswrapper[4970]: I1128 13:40:02.269979 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8d4e078f-7801-41ac-a4b0-474048f86e2b-operator-scripts\") pod \"8d4e078f-7801-41ac-a4b0-474048f86e2b\" (UID: \"8d4e078f-7801-41ac-a4b0-474048f86e2b\") " Nov 28 13:40:02 crc kubenswrapper[4970]: I1128 13:40:02.270392 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z77bv\" (UniqueName: \"kubernetes.io/projected/8d4e078f-7801-41ac-a4b0-474048f86e2b-kube-api-access-z77bv\") pod \"8d4e078f-7801-41ac-a4b0-474048f86e2b\" (UID: \"8d4e078f-7801-41ac-a4b0-474048f86e2b\") " Nov 28 13:40:02 crc kubenswrapper[4970]: I1128 13:40:02.271303 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8d4e078f-7801-41ac-a4b0-474048f86e2b-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "8d4e078f-7801-41ac-a4b0-474048f86e2b" (UID: "8d4e078f-7801-41ac-a4b0-474048f86e2b"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 28 13:40:02 crc kubenswrapper[4970]: I1128 13:40:02.281903 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8d4e078f-7801-41ac-a4b0-474048f86e2b-kube-api-access-z77bv" (OuterVolumeSpecName: "kube-api-access-z77bv") pod "8d4e078f-7801-41ac-a4b0-474048f86e2b" (UID: "8d4e078f-7801-41ac-a4b0-474048f86e2b"). InnerVolumeSpecName "kube-api-access-z77bv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 28 13:40:02 crc kubenswrapper[4970]: I1128 13:40:02.372338 4970 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8d4e078f-7801-41ac-a4b0-474048f86e2b-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 28 13:40:02 crc kubenswrapper[4970]: I1128 13:40:02.372388 4970 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z77bv\" (UniqueName: \"kubernetes.io/projected/8d4e078f-7801-41ac-a4b0-474048f86e2b-kube-api-access-z77bv\") on node \"crc\" DevicePath \"\"" Nov 28 13:40:02 crc kubenswrapper[4970]: I1128 13:40:02.428449 4970 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone-6d5764bf8b-f5qjg" Nov 28 13:40:02 crc kubenswrapper[4970]: I1128 13:40:02.575193 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2967a1ca-b9f6-4f48-a371-c46581e8d68a-config-data\") pod \"2967a1ca-b9f6-4f48-a371-c46581e8d68a\" (UID: \"2967a1ca-b9f6-4f48-a371-c46581e8d68a\") " Nov 28 13:40:02 crc kubenswrapper[4970]: I1128 13:40:02.575411 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dhlf7\" (UniqueName: \"kubernetes.io/projected/2967a1ca-b9f6-4f48-a371-c46581e8d68a-kube-api-access-dhlf7\") pod \"2967a1ca-b9f6-4f48-a371-c46581e8d68a\" (UID: \"2967a1ca-b9f6-4f48-a371-c46581e8d68a\") " Nov 28 13:40:02 crc kubenswrapper[4970]: I1128 13:40:02.575569 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2967a1ca-b9f6-4f48-a371-c46581e8d68a-combined-ca-bundle\") pod \"2967a1ca-b9f6-4f48-a371-c46581e8d68a\" (UID: \"2967a1ca-b9f6-4f48-a371-c46581e8d68a\") " Nov 28 13:40:02 crc kubenswrapper[4970]: I1128 13:40:02.575664 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2967a1ca-b9f6-4f48-a371-c46581e8d68a-internal-tls-certs\") pod \"2967a1ca-b9f6-4f48-a371-c46581e8d68a\" (UID: \"2967a1ca-b9f6-4f48-a371-c46581e8d68a\") " Nov 28 13:40:02 crc kubenswrapper[4970]: I1128 13:40:02.575764 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2967a1ca-b9f6-4f48-a371-c46581e8d68a-public-tls-certs\") pod \"2967a1ca-b9f6-4f48-a371-c46581e8d68a\" (UID: \"2967a1ca-b9f6-4f48-a371-c46581e8d68a\") " Nov 28 13:40:02 crc kubenswrapper[4970]: I1128 13:40:02.575915 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/2967a1ca-b9f6-4f48-a371-c46581e8d68a-credential-keys\") pod \"2967a1ca-b9f6-4f48-a371-c46581e8d68a\" (UID: \"2967a1ca-b9f6-4f48-a371-c46581e8d68a\") " Nov 28 13:40:02 crc kubenswrapper[4970]: I1128 13:40:02.575994 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2967a1ca-b9f6-4f48-a371-c46581e8d68a-scripts\") pod \"2967a1ca-b9f6-4f48-a371-c46581e8d68a\" (UID: \"2967a1ca-b9f6-4f48-a371-c46581e8d68a\") " Nov 28 13:40:02 crc kubenswrapper[4970]: I1128 13:40:02.576042 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/2967a1ca-b9f6-4f48-a371-c46581e8d68a-fernet-keys\") pod 
\"2967a1ca-b9f6-4f48-a371-c46581e8d68a\" (UID: \"2967a1ca-b9f6-4f48-a371-c46581e8d68a\") " Nov 28 13:40:02 crc kubenswrapper[4970]: I1128 13:40:02.582242 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2967a1ca-b9f6-4f48-a371-c46581e8d68a-scripts" (OuterVolumeSpecName: "scripts") pod "2967a1ca-b9f6-4f48-a371-c46581e8d68a" (UID: "2967a1ca-b9f6-4f48-a371-c46581e8d68a"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 28 13:40:02 crc kubenswrapper[4970]: I1128 13:40:02.582258 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2967a1ca-b9f6-4f48-a371-c46581e8d68a-kube-api-access-dhlf7" (OuterVolumeSpecName: "kube-api-access-dhlf7") pod "2967a1ca-b9f6-4f48-a371-c46581e8d68a" (UID: "2967a1ca-b9f6-4f48-a371-c46581e8d68a"). InnerVolumeSpecName "kube-api-access-dhlf7". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 28 13:40:02 crc kubenswrapper[4970]: I1128 13:40:02.582997 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2967a1ca-b9f6-4f48-a371-c46581e8d68a-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "2967a1ca-b9f6-4f48-a371-c46581e8d68a" (UID: "2967a1ca-b9f6-4f48-a371-c46581e8d68a"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 28 13:40:02 crc kubenswrapper[4970]: I1128 13:40:02.583625 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2967a1ca-b9f6-4f48-a371-c46581e8d68a-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "2967a1ca-b9f6-4f48-a371-c46581e8d68a" (UID: "2967a1ca-b9f6-4f48-a371-c46581e8d68a"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 28 13:40:02 crc kubenswrapper[4970]: I1128 13:40:02.598405 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2967a1ca-b9f6-4f48-a371-c46581e8d68a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2967a1ca-b9f6-4f48-a371-c46581e8d68a" (UID: "2967a1ca-b9f6-4f48-a371-c46581e8d68a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 28 13:40:02 crc kubenswrapper[4970]: I1128 13:40:02.601371 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2967a1ca-b9f6-4f48-a371-c46581e8d68a-config-data" (OuterVolumeSpecName: "config-data") pod "2967a1ca-b9f6-4f48-a371-c46581e8d68a" (UID: "2967a1ca-b9f6-4f48-a371-c46581e8d68a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 28 13:40:02 crc kubenswrapper[4970]: I1128 13:40:02.615475 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2967a1ca-b9f6-4f48-a371-c46581e8d68a-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "2967a1ca-b9f6-4f48-a371-c46581e8d68a" (UID: "2967a1ca-b9f6-4f48-a371-c46581e8d68a"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 28 13:40:02 crc kubenswrapper[4970]: I1128 13:40:02.638126 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2967a1ca-b9f6-4f48-a371-c46581e8d68a-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "2967a1ca-b9f6-4f48-a371-c46581e8d68a" (UID: "2967a1ca-b9f6-4f48-a371-c46581e8d68a"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 28 13:40:02 crc kubenswrapper[4970]: I1128 13:40:02.678958 4970 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/2967a1ca-b9f6-4f48-a371-c46581e8d68a-credential-keys\") on node \"crc\" DevicePath \"\"" Nov 28 13:40:02 crc kubenswrapper[4970]: I1128 13:40:02.679014 4970 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2967a1ca-b9f6-4f48-a371-c46581e8d68a-scripts\") on node \"crc\" DevicePath \"\"" Nov 28 13:40:02 crc kubenswrapper[4970]: I1128 13:40:02.679036 4970 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/2967a1ca-b9f6-4f48-a371-c46581e8d68a-fernet-keys\") on node \"crc\" DevicePath \"\"" Nov 28 13:40:02 crc kubenswrapper[4970]: I1128 13:40:02.679054 4970 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2967a1ca-b9f6-4f48-a371-c46581e8d68a-config-data\") on node \"crc\" DevicePath \"\"" Nov 28 13:40:02 crc kubenswrapper[4970]: I1128 13:40:02.679074 4970 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dhlf7\" (UniqueName: \"kubernetes.io/projected/2967a1ca-b9f6-4f48-a371-c46581e8d68a-kube-api-access-dhlf7\") on node \"crc\" DevicePath \"\"" Nov 28 13:40:02 crc kubenswrapper[4970]: I1128 13:40:02.679094 4970 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2967a1ca-b9f6-4f48-a371-c46581e8d68a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 28 13:40:02 crc kubenswrapper[4970]: I1128 13:40:02.679112 4970 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2967a1ca-b9f6-4f48-a371-c46581e8d68a-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Nov 28 13:40:02 crc kubenswrapper[4970]: I1128 13:40:02.679130 4970 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2967a1ca-b9f6-4f48-a371-c46581e8d68a-public-tls-certs\") on node \"crc\" DevicePath \"\"" Nov 28 13:40:02 crc kubenswrapper[4970]: I1128 13:40:02.778155 4970 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="keystone-kuttl-tests/keystone66ba-account-delete-pf297" Nov 28 13:40:02 crc kubenswrapper[4970]: I1128 13:40:02.778145 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone66ba-account-delete-pf297" event={"ID":"8d4e078f-7801-41ac-a4b0-474048f86e2b","Type":"ContainerDied","Data":"9e7f5360463333a1bdd2e2a57e71510628bf0bcf15f8ee0529df822167d1ff9a"} Nov 28 13:40:02 crc kubenswrapper[4970]: I1128 13:40:02.778437 4970 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9e7f5360463333a1bdd2e2a57e71510628bf0bcf15f8ee0529df822167d1ff9a" Nov 28 13:40:02 crc kubenswrapper[4970]: I1128 13:40:02.780528 4970 generic.go:334] "Generic (PLEG): container finished" podID="2967a1ca-b9f6-4f48-a371-c46581e8d68a" containerID="f350f2dfe2cf611c586794231eba8667a15cf34662ad64cff43f4359d7f4f9b6" exitCode=0 Nov 28 13:40:02 crc kubenswrapper[4970]: I1128 13:40:02.780589 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-6d5764bf8b-f5qjg" event={"ID":"2967a1ca-b9f6-4f48-a371-c46581e8d68a","Type":"ContainerDied","Data":"f350f2dfe2cf611c586794231eba8667a15cf34662ad64cff43f4359d7f4f9b6"} Nov 28 13:40:02 crc kubenswrapper[4970]: I1128 13:40:02.780663 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-6d5764bf8b-f5qjg" event={"ID":"2967a1ca-b9f6-4f48-a371-c46581e8d68a","Type":"ContainerDied","Data":"14a513c897f1835c6d2c2f1bb539072077d48ac58debed3616183ee61096c0da"} Nov 28 13:40:02 crc kubenswrapper[4970]: I1128 13:40:02.780699 4970 scope.go:117] "RemoveContainer" containerID="f350f2dfe2cf611c586794231eba8667a15cf34662ad64cff43f4359d7f4f9b6" Nov 28 13:40:02 crc kubenswrapper[4970]: I1128 13:40:02.780609 4970 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="keystone-kuttl-tests/keystone-6d5764bf8b-f5qjg" Nov 28 13:40:02 crc kubenswrapper[4970]: I1128 13:40:02.824804 4970 scope.go:117] "RemoveContainer" containerID="f350f2dfe2cf611c586794231eba8667a15cf34662ad64cff43f4359d7f4f9b6" Nov 28 13:40:02 crc kubenswrapper[4970]: E1128 13:40:02.825376 4970 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f350f2dfe2cf611c586794231eba8667a15cf34662ad64cff43f4359d7f4f9b6\": container with ID starting with f350f2dfe2cf611c586794231eba8667a15cf34662ad64cff43f4359d7f4f9b6 not found: ID does not exist" containerID="f350f2dfe2cf611c586794231eba8667a15cf34662ad64cff43f4359d7f4f9b6" Nov 28 13:40:02 crc kubenswrapper[4970]: I1128 13:40:02.825411 4970 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f350f2dfe2cf611c586794231eba8667a15cf34662ad64cff43f4359d7f4f9b6"} err="failed to get container status \"f350f2dfe2cf611c586794231eba8667a15cf34662ad64cff43f4359d7f4f9b6\": rpc error: code = NotFound desc = could not find container \"f350f2dfe2cf611c586794231eba8667a15cf34662ad64cff43f4359d7f4f9b6\": container with ID starting with f350f2dfe2cf611c586794231eba8667a15cf34662ad64cff43f4359d7f4f9b6 not found: ID does not exist" Nov 28 13:40:02 crc kubenswrapper[4970]: I1128 13:40:02.827361 4970 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["keystone-kuttl-tests/keystone-6d5764bf8b-f5qjg"] Nov 28 13:40:02 crc kubenswrapper[4970]: I1128 13:40:02.832138 4970 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["keystone-kuttl-tests/keystone-6d5764bf8b-f5qjg"] Nov 28 13:40:03 crc kubenswrapper[4970]: I1128 13:40:03.395277 4970 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2967a1ca-b9f6-4f48-a371-c46581e8d68a" path="/var/lib/kubelet/pods/2967a1ca-b9f6-4f48-a371-c46581e8d68a/volumes" Nov 28 13:40:03 crc kubenswrapper[4970]: I1128 13:40:03.975593 4970 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["keystone-kuttl-tests/keystone66ba-account-delete-pf297"] Nov 28 13:40:03 crc kubenswrapper[4970]: I1128 13:40:03.981556 4970 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["keystone-kuttl-tests/keystone66ba-account-delete-pf297"] Nov 28 13:40:03 crc kubenswrapper[4970]: I1128 13:40:03.986831 4970 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["keystone-kuttl-tests/keystone-66ba-account-create-update-q4plw"] Nov 28 13:40:03 crc kubenswrapper[4970]: I1128 13:40:03.991938 4970 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["keystone-kuttl-tests/keystone-db-create-vc4f6"] Nov 28 13:40:03 crc kubenswrapper[4970]: I1128 13:40:03.995400 4970 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["keystone-kuttl-tests/keystone-66ba-account-create-update-q4plw"] Nov 28 13:40:03 crc kubenswrapper[4970]: I1128 13:40:03.998828 4970 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["keystone-kuttl-tests/keystone-db-create-vc4f6"] Nov 28 13:40:04 crc kubenswrapper[4970]: I1128 13:40:04.055200 4970 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["keystone-kuttl-tests/keystone-db-create-klrz9"] Nov 28 13:40:04 crc kubenswrapper[4970]: E1128 13:40:04.055601 4970 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2967a1ca-b9f6-4f48-a371-c46581e8d68a" containerName="keystone-api" Nov 28 13:40:04 crc kubenswrapper[4970]: I1128 13:40:04.055627 4970 state_mem.go:107] "Deleted CPUSet assignment" podUID="2967a1ca-b9f6-4f48-a371-c46581e8d68a" containerName="keystone-api" Nov 28 
13:40:04 crc kubenswrapper[4970]: E1128 13:40:04.055664 4970 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8d4e078f-7801-41ac-a4b0-474048f86e2b" containerName="mariadb-account-delete" Nov 28 13:40:04 crc kubenswrapper[4970]: I1128 13:40:04.055679 4970 state_mem.go:107] "Deleted CPUSet assignment" podUID="8d4e078f-7801-41ac-a4b0-474048f86e2b" containerName="mariadb-account-delete" Nov 28 13:40:04 crc kubenswrapper[4970]: I1128 13:40:04.055842 4970 memory_manager.go:354] "RemoveStaleState removing state" podUID="2967a1ca-b9f6-4f48-a371-c46581e8d68a" containerName="keystone-api" Nov 28 13:40:04 crc kubenswrapper[4970]: I1128 13:40:04.055866 4970 memory_manager.go:354] "RemoveStaleState removing state" podUID="8d4e078f-7801-41ac-a4b0-474048f86e2b" containerName="mariadb-account-delete" Nov 28 13:40:04 crc kubenswrapper[4970]: I1128 13:40:04.056384 4970 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone-db-create-klrz9" Nov 28 13:40:04 crc kubenswrapper[4970]: I1128 13:40:04.060565 4970 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/keystone-db-create-klrz9"] Nov 28 13:40:04 crc kubenswrapper[4970]: I1128 13:40:04.116404 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/02171036-5aa9-4a47-9469-5faa9a495e23-operator-scripts\") pod \"keystone-db-create-klrz9\" (UID: \"02171036-5aa9-4a47-9469-5faa9a495e23\") " pod="keystone-kuttl-tests/keystone-db-create-klrz9" Nov 28 13:40:04 crc kubenswrapper[4970]: I1128 13:40:04.116503 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cj24r\" (UniqueName: \"kubernetes.io/projected/02171036-5aa9-4a47-9469-5faa9a495e23-kube-api-access-cj24r\") pod \"keystone-db-create-klrz9\" (UID: \"02171036-5aa9-4a47-9469-5faa9a495e23\") " pod="keystone-kuttl-tests/keystone-db-create-klrz9" Nov 28 13:40:04 crc kubenswrapper[4970]: I1128 13:40:04.166456 4970 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["keystone-kuttl-tests/keystone-5656-account-create-update-hwplx"] Nov 28 13:40:04 crc kubenswrapper[4970]: I1128 13:40:04.167732 4970 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="keystone-kuttl-tests/keystone-5656-account-create-update-hwplx" Nov 28 13:40:04 crc kubenswrapper[4970]: I1128 13:40:04.170457 4970 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"keystone-db-secret" Nov 28 13:40:04 crc kubenswrapper[4970]: I1128 13:40:04.181327 4970 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/keystone-5656-account-create-update-hwplx"] Nov 28 13:40:04 crc kubenswrapper[4970]: I1128 13:40:04.218907 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cj24r\" (UniqueName: \"kubernetes.io/projected/02171036-5aa9-4a47-9469-5faa9a495e23-kube-api-access-cj24r\") pod \"keystone-db-create-klrz9\" (UID: \"02171036-5aa9-4a47-9469-5faa9a495e23\") " pod="keystone-kuttl-tests/keystone-db-create-klrz9" Nov 28 13:40:04 crc kubenswrapper[4970]: I1128 13:40:04.218993 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8ncc8\" (UniqueName: \"kubernetes.io/projected/a4e8dd6a-a78b-4f89-bab5-20fd7e8da6a9-kube-api-access-8ncc8\") pod \"keystone-5656-account-create-update-hwplx\" (UID: \"a4e8dd6a-a78b-4f89-bab5-20fd7e8da6a9\") " pod="keystone-kuttl-tests/keystone-5656-account-create-update-hwplx" Nov 28 13:40:04 crc kubenswrapper[4970]: I1128 13:40:04.219081 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a4e8dd6a-a78b-4f89-bab5-20fd7e8da6a9-operator-scripts\") pod \"keystone-5656-account-create-update-hwplx\" (UID: \"a4e8dd6a-a78b-4f89-bab5-20fd7e8da6a9\") " pod="keystone-kuttl-tests/keystone-5656-account-create-update-hwplx" Nov 28 13:40:04 crc kubenswrapper[4970]: I1128 13:40:04.219493 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/02171036-5aa9-4a47-9469-5faa9a495e23-operator-scripts\") pod \"keystone-db-create-klrz9\" (UID: \"02171036-5aa9-4a47-9469-5faa9a495e23\") " pod="keystone-kuttl-tests/keystone-db-create-klrz9" Nov 28 13:40:04 crc kubenswrapper[4970]: I1128 13:40:04.220816 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/02171036-5aa9-4a47-9469-5faa9a495e23-operator-scripts\") pod \"keystone-db-create-klrz9\" (UID: \"02171036-5aa9-4a47-9469-5faa9a495e23\") " pod="keystone-kuttl-tests/keystone-db-create-klrz9" Nov 28 13:40:04 crc kubenswrapper[4970]: I1128 13:40:04.248348 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cj24r\" (UniqueName: \"kubernetes.io/projected/02171036-5aa9-4a47-9469-5faa9a495e23-kube-api-access-cj24r\") pod \"keystone-db-create-klrz9\" (UID: \"02171036-5aa9-4a47-9469-5faa9a495e23\") " pod="keystone-kuttl-tests/keystone-db-create-klrz9" Nov 28 13:40:04 crc kubenswrapper[4970]: I1128 13:40:04.321904 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8ncc8\" (UniqueName: \"kubernetes.io/projected/a4e8dd6a-a78b-4f89-bab5-20fd7e8da6a9-kube-api-access-8ncc8\") pod \"keystone-5656-account-create-update-hwplx\" (UID: \"a4e8dd6a-a78b-4f89-bab5-20fd7e8da6a9\") " pod="keystone-kuttl-tests/keystone-5656-account-create-update-hwplx" Nov 28 13:40:04 crc kubenswrapper[4970]: I1128 13:40:04.322014 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a4e8dd6a-a78b-4f89-bab5-20fd7e8da6a9-operator-scripts\") pod \"keystone-5656-account-create-update-hwplx\" (UID: \"a4e8dd6a-a78b-4f89-bab5-20fd7e8da6a9\") " pod="keystone-kuttl-tests/keystone-5656-account-create-update-hwplx" Nov 28 13:40:04 crc kubenswrapper[4970]: I1128 13:40:04.323093 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a4e8dd6a-a78b-4f89-bab5-20fd7e8da6a9-operator-scripts\") pod \"keystone-5656-account-create-update-hwplx\" (UID: \"a4e8dd6a-a78b-4f89-bab5-20fd7e8da6a9\") " pod="keystone-kuttl-tests/keystone-5656-account-create-update-hwplx" Nov 28 13:40:04 crc kubenswrapper[4970]: I1128 13:40:04.351115 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8ncc8\" (UniqueName: \"kubernetes.io/projected/a4e8dd6a-a78b-4f89-bab5-20fd7e8da6a9-kube-api-access-8ncc8\") pod \"keystone-5656-account-create-update-hwplx\" (UID: \"a4e8dd6a-a78b-4f89-bab5-20fd7e8da6a9\") " pod="keystone-kuttl-tests/keystone-5656-account-create-update-hwplx" Nov 28 13:40:04 crc kubenswrapper[4970]: I1128 13:40:04.372823 4970 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone-db-create-klrz9" Nov 28 13:40:04 crc kubenswrapper[4970]: I1128 13:40:04.484041 4970 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone-5656-account-create-update-hwplx" Nov 28 13:40:04 crc kubenswrapper[4970]: I1128 13:40:04.628020 4970 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/keystone-db-create-klrz9"] Nov 28 13:40:04 crc kubenswrapper[4970]: W1128 13:40:04.636935 4970 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod02171036_5aa9_4a47_9469_5faa9a495e23.slice/crio-05f464ee94afb9cf061b04a6a93d5c3ddd3b49b03690c8aef7a5d27e13ed7747 WatchSource:0}: Error finding container 05f464ee94afb9cf061b04a6a93d5c3ddd3b49b03690c8aef7a5d27e13ed7747: Status 404 returned error can't find the container with id 05f464ee94afb9cf061b04a6a93d5c3ddd3b49b03690c8aef7a5d27e13ed7747 Nov 28 13:40:04 crc kubenswrapper[4970]: I1128 13:40:04.715099 4970 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/keystone-5656-account-create-update-hwplx"] Nov 28 13:40:04 crc kubenswrapper[4970]: W1128 13:40:04.721961 4970 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda4e8dd6a_a78b_4f89_bab5_20fd7e8da6a9.slice/crio-06b50df83f7f26ed1c3a8a78eb20a8fdf7d3c507d1cccd9beb1ad5ef38725198 WatchSource:0}: Error finding container 06b50df83f7f26ed1c3a8a78eb20a8fdf7d3c507d1cccd9beb1ad5ef38725198: Status 404 returned error can't find the container with id 06b50df83f7f26ed1c3a8a78eb20a8fdf7d3c507d1cccd9beb1ad5ef38725198 Nov 28 13:40:04 crc kubenswrapper[4970]: I1128 13:40:04.801610 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-5656-account-create-update-hwplx" event={"ID":"a4e8dd6a-a78b-4f89-bab5-20fd7e8da6a9","Type":"ContainerStarted","Data":"06b50df83f7f26ed1c3a8a78eb20a8fdf7d3c507d1cccd9beb1ad5ef38725198"} Nov 28 13:40:04 crc kubenswrapper[4970]: I1128 13:40:04.803521 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-db-create-klrz9" 
event={"ID":"02171036-5aa9-4a47-9469-5faa9a495e23","Type":"ContainerStarted","Data":"82715e6f6166f873f9da6a9ee4768eb74cec82c4e44e80311b9edafe2c0e8cdd"} Nov 28 13:40:04 crc kubenswrapper[4970]: I1128 13:40:04.803568 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-db-create-klrz9" event={"ID":"02171036-5aa9-4a47-9469-5faa9a495e23","Type":"ContainerStarted","Data":"05f464ee94afb9cf061b04a6a93d5c3ddd3b49b03690c8aef7a5d27e13ed7747"} Nov 28 13:40:05 crc kubenswrapper[4970]: I1128 13:40:05.396105 4970 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5f3d4672-d9ce-46b4-829a-b595fc3909ca" path="/var/lib/kubelet/pods/5f3d4672-d9ce-46b4-829a-b595fc3909ca/volumes" Nov 28 13:40:05 crc kubenswrapper[4970]: I1128 13:40:05.397613 4970 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8d33b453-5415-4302-a3a6-77a74219cf38" path="/var/lib/kubelet/pods/8d33b453-5415-4302-a3a6-77a74219cf38/volumes" Nov 28 13:40:05 crc kubenswrapper[4970]: I1128 13:40:05.398533 4970 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8d4e078f-7801-41ac-a4b0-474048f86e2b" path="/var/lib/kubelet/pods/8d4e078f-7801-41ac-a4b0-474048f86e2b/volumes" Nov 28 13:40:05 crc kubenswrapper[4970]: I1128 13:40:05.816533 4970 generic.go:334] "Generic (PLEG): container finished" podID="a4e8dd6a-a78b-4f89-bab5-20fd7e8da6a9" containerID="61f3b0e29d43cf715e6e77ba454db2738d3b86164f0d058812d4d752197ddb83" exitCode=0 Nov 28 13:40:05 crc kubenswrapper[4970]: I1128 13:40:05.816611 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-5656-account-create-update-hwplx" event={"ID":"a4e8dd6a-a78b-4f89-bab5-20fd7e8da6a9","Type":"ContainerDied","Data":"61f3b0e29d43cf715e6e77ba454db2738d3b86164f0d058812d4d752197ddb83"} Nov 28 13:40:05 crc kubenswrapper[4970]: I1128 13:40:05.819993 4970 generic.go:334] "Generic (PLEG): container finished" podID="02171036-5aa9-4a47-9469-5faa9a495e23" containerID="82715e6f6166f873f9da6a9ee4768eb74cec82c4e44e80311b9edafe2c0e8cdd" exitCode=0 Nov 28 13:40:05 crc kubenswrapper[4970]: I1128 13:40:05.820047 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-db-create-klrz9" event={"ID":"02171036-5aa9-4a47-9469-5faa9a495e23","Type":"ContainerDied","Data":"82715e6f6166f873f9da6a9ee4768eb74cec82c4e44e80311b9edafe2c0e8cdd"} Nov 28 13:40:07 crc kubenswrapper[4970]: I1128 13:40:07.286301 4970 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone-db-create-klrz9" Nov 28 13:40:07 crc kubenswrapper[4970]: I1128 13:40:07.292235 4970 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="keystone-kuttl-tests/keystone-5656-account-create-update-hwplx" Nov 28 13:40:07 crc kubenswrapper[4970]: I1128 13:40:07.366390 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cj24r\" (UniqueName: \"kubernetes.io/projected/02171036-5aa9-4a47-9469-5faa9a495e23-kube-api-access-cj24r\") pod \"02171036-5aa9-4a47-9469-5faa9a495e23\" (UID: \"02171036-5aa9-4a47-9469-5faa9a495e23\") " Nov 28 13:40:07 crc kubenswrapper[4970]: I1128 13:40:07.366512 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/02171036-5aa9-4a47-9469-5faa9a495e23-operator-scripts\") pod \"02171036-5aa9-4a47-9469-5faa9a495e23\" (UID: \"02171036-5aa9-4a47-9469-5faa9a495e23\") " Nov 28 13:40:07 crc kubenswrapper[4970]: I1128 13:40:07.366544 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a4e8dd6a-a78b-4f89-bab5-20fd7e8da6a9-operator-scripts\") pod \"a4e8dd6a-a78b-4f89-bab5-20fd7e8da6a9\" (UID: \"a4e8dd6a-a78b-4f89-bab5-20fd7e8da6a9\") " Nov 28 13:40:07 crc kubenswrapper[4970]: I1128 13:40:07.366602 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8ncc8\" (UniqueName: \"kubernetes.io/projected/a4e8dd6a-a78b-4f89-bab5-20fd7e8da6a9-kube-api-access-8ncc8\") pod \"a4e8dd6a-a78b-4f89-bab5-20fd7e8da6a9\" (UID: \"a4e8dd6a-a78b-4f89-bab5-20fd7e8da6a9\") " Nov 28 13:40:07 crc kubenswrapper[4970]: I1128 13:40:07.367413 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a4e8dd6a-a78b-4f89-bab5-20fd7e8da6a9-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "a4e8dd6a-a78b-4f89-bab5-20fd7e8da6a9" (UID: "a4e8dd6a-a78b-4f89-bab5-20fd7e8da6a9"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 28 13:40:07 crc kubenswrapper[4970]: I1128 13:40:07.367512 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/02171036-5aa9-4a47-9469-5faa9a495e23-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "02171036-5aa9-4a47-9469-5faa9a495e23" (UID: "02171036-5aa9-4a47-9469-5faa9a495e23"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 28 13:40:07 crc kubenswrapper[4970]: I1128 13:40:07.371898 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/02171036-5aa9-4a47-9469-5faa9a495e23-kube-api-access-cj24r" (OuterVolumeSpecName: "kube-api-access-cj24r") pod "02171036-5aa9-4a47-9469-5faa9a495e23" (UID: "02171036-5aa9-4a47-9469-5faa9a495e23"). InnerVolumeSpecName "kube-api-access-cj24r". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 28 13:40:07 crc kubenswrapper[4970]: I1128 13:40:07.374492 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a4e8dd6a-a78b-4f89-bab5-20fd7e8da6a9-kube-api-access-8ncc8" (OuterVolumeSpecName: "kube-api-access-8ncc8") pod "a4e8dd6a-a78b-4f89-bab5-20fd7e8da6a9" (UID: "a4e8dd6a-a78b-4f89-bab5-20fd7e8da6a9"). InnerVolumeSpecName "kube-api-access-8ncc8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 28 13:40:07 crc kubenswrapper[4970]: I1128 13:40:07.468674 4970 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cj24r\" (UniqueName: \"kubernetes.io/projected/02171036-5aa9-4a47-9469-5faa9a495e23-kube-api-access-cj24r\") on node \"crc\" DevicePath \"\"" Nov 28 13:40:07 crc kubenswrapper[4970]: I1128 13:40:07.468705 4970 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/02171036-5aa9-4a47-9469-5faa9a495e23-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 28 13:40:07 crc kubenswrapper[4970]: I1128 13:40:07.468715 4970 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a4e8dd6a-a78b-4f89-bab5-20fd7e8da6a9-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 28 13:40:07 crc kubenswrapper[4970]: I1128 13:40:07.468724 4970 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8ncc8\" (UniqueName: \"kubernetes.io/projected/a4e8dd6a-a78b-4f89-bab5-20fd7e8da6a9-kube-api-access-8ncc8\") on node \"crc\" DevicePath \"\"" Nov 28 13:40:07 crc kubenswrapper[4970]: I1128 13:40:07.843942 4970 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone-5656-account-create-update-hwplx" Nov 28 13:40:07 crc kubenswrapper[4970]: I1128 13:40:07.843982 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-5656-account-create-update-hwplx" event={"ID":"a4e8dd6a-a78b-4f89-bab5-20fd7e8da6a9","Type":"ContainerDied","Data":"06b50df83f7f26ed1c3a8a78eb20a8fdf7d3c507d1cccd9beb1ad5ef38725198"} Nov 28 13:40:07 crc kubenswrapper[4970]: I1128 13:40:07.844101 4970 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="06b50df83f7f26ed1c3a8a78eb20a8fdf7d3c507d1cccd9beb1ad5ef38725198" Nov 28 13:40:07 crc kubenswrapper[4970]: I1128 13:40:07.846740 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-db-create-klrz9" event={"ID":"02171036-5aa9-4a47-9469-5faa9a495e23","Type":"ContainerDied","Data":"05f464ee94afb9cf061b04a6a93d5c3ddd3b49b03690c8aef7a5d27e13ed7747"} Nov 28 13:40:07 crc kubenswrapper[4970]: I1128 13:40:07.846788 4970 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="05f464ee94afb9cf061b04a6a93d5c3ddd3b49b03690c8aef7a5d27e13ed7747" Nov 28 13:40:07 crc kubenswrapper[4970]: I1128 13:40:07.846850 4970 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="keystone-kuttl-tests/keystone-db-create-klrz9" Nov 28 13:40:09 crc kubenswrapper[4970]: I1128 13:40:09.743496 4970 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["keystone-kuttl-tests/keystone-db-sync-h25mj"] Nov 28 13:40:09 crc kubenswrapper[4970]: E1128 13:40:09.744158 4970 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a4e8dd6a-a78b-4f89-bab5-20fd7e8da6a9" containerName="mariadb-account-create-update" Nov 28 13:40:09 crc kubenswrapper[4970]: I1128 13:40:09.744181 4970 state_mem.go:107] "Deleted CPUSet assignment" podUID="a4e8dd6a-a78b-4f89-bab5-20fd7e8da6a9" containerName="mariadb-account-create-update" Nov 28 13:40:09 crc kubenswrapper[4970]: E1128 13:40:09.744241 4970 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="02171036-5aa9-4a47-9469-5faa9a495e23" containerName="mariadb-database-create" Nov 28 13:40:09 crc kubenswrapper[4970]: I1128 13:40:09.744256 4970 state_mem.go:107] "Deleted CPUSet assignment" podUID="02171036-5aa9-4a47-9469-5faa9a495e23" containerName="mariadb-database-create" Nov 28 13:40:09 crc kubenswrapper[4970]: I1128 13:40:09.744473 4970 memory_manager.go:354] "RemoveStaleState removing state" podUID="a4e8dd6a-a78b-4f89-bab5-20fd7e8da6a9" containerName="mariadb-account-create-update" Nov 28 13:40:09 crc kubenswrapper[4970]: I1128 13:40:09.744511 4970 memory_manager.go:354] "RemoveStaleState removing state" podUID="02171036-5aa9-4a47-9469-5faa9a495e23" containerName="mariadb-database-create" Nov 28 13:40:09 crc kubenswrapper[4970]: I1128 13:40:09.745257 4970 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone-db-sync-h25mj" Nov 28 13:40:09 crc kubenswrapper[4970]: I1128 13:40:09.747248 4970 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"keystone" Nov 28 13:40:09 crc kubenswrapper[4970]: I1128 13:40:09.747305 4970 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"keystone-scripts" Nov 28 13:40:09 crc kubenswrapper[4970]: I1128 13:40:09.749123 4970 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"keystone-keystone-dockercfg-46bdv" Nov 28 13:40:09 crc kubenswrapper[4970]: I1128 13:40:09.751109 4970 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"keystone-config-data" Nov 28 13:40:09 crc kubenswrapper[4970]: I1128 13:40:09.762369 4970 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/keystone-db-sync-h25mj"] Nov 28 13:40:09 crc kubenswrapper[4970]: I1128 13:40:09.908464 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6859145c-f5f5-4e5f-a04c-aea7081007be-config-data\") pod \"keystone-db-sync-h25mj\" (UID: \"6859145c-f5f5-4e5f-a04c-aea7081007be\") " pod="keystone-kuttl-tests/keystone-db-sync-h25mj" Nov 28 13:40:09 crc kubenswrapper[4970]: I1128 13:40:09.908585 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8mnk9\" (UniqueName: \"kubernetes.io/projected/6859145c-f5f5-4e5f-a04c-aea7081007be-kube-api-access-8mnk9\") pod \"keystone-db-sync-h25mj\" (UID: \"6859145c-f5f5-4e5f-a04c-aea7081007be\") " pod="keystone-kuttl-tests/keystone-db-sync-h25mj" Nov 28 13:40:10 crc kubenswrapper[4970]: I1128 13:40:10.010435 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/6859145c-f5f5-4e5f-a04c-aea7081007be-config-data\") pod \"keystone-db-sync-h25mj\" (UID: \"6859145c-f5f5-4e5f-a04c-aea7081007be\") " pod="keystone-kuttl-tests/keystone-db-sync-h25mj" Nov 28 13:40:10 crc kubenswrapper[4970]: I1128 13:40:10.010549 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8mnk9\" (UniqueName: \"kubernetes.io/projected/6859145c-f5f5-4e5f-a04c-aea7081007be-kube-api-access-8mnk9\") pod \"keystone-db-sync-h25mj\" (UID: \"6859145c-f5f5-4e5f-a04c-aea7081007be\") " pod="keystone-kuttl-tests/keystone-db-sync-h25mj" Nov 28 13:40:10 crc kubenswrapper[4970]: I1128 13:40:10.017854 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6859145c-f5f5-4e5f-a04c-aea7081007be-config-data\") pod \"keystone-db-sync-h25mj\" (UID: \"6859145c-f5f5-4e5f-a04c-aea7081007be\") " pod="keystone-kuttl-tests/keystone-db-sync-h25mj" Nov 28 13:40:10 crc kubenswrapper[4970]: I1128 13:40:10.037579 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8mnk9\" (UniqueName: \"kubernetes.io/projected/6859145c-f5f5-4e5f-a04c-aea7081007be-kube-api-access-8mnk9\") pod \"keystone-db-sync-h25mj\" (UID: \"6859145c-f5f5-4e5f-a04c-aea7081007be\") " pod="keystone-kuttl-tests/keystone-db-sync-h25mj" Nov 28 13:40:10 crc kubenswrapper[4970]: I1128 13:40:10.072292 4970 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone-db-sync-h25mj" Nov 28 13:40:10 crc kubenswrapper[4970]: I1128 13:40:10.532609 4970 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/keystone-db-sync-h25mj"] Nov 28 13:40:10 crc kubenswrapper[4970]: I1128 13:40:10.875922 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-db-sync-h25mj" event={"ID":"6859145c-f5f5-4e5f-a04c-aea7081007be","Type":"ContainerStarted","Data":"1128aebe220737ff2e2d775f194cbfdb69b4c7a868a4c6d474dac50a8fe30e9e"} Nov 28 13:40:11 crc kubenswrapper[4970]: I1128 13:40:11.898081 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-db-sync-h25mj" event={"ID":"6859145c-f5f5-4e5f-a04c-aea7081007be","Type":"ContainerStarted","Data":"1f3aa2553a84343080af5009b2640fd359598a4ef84e55cfabfa5180558a8ff7"} Nov 28 13:40:11 crc kubenswrapper[4970]: I1128 13:40:11.926949 4970 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="keystone-kuttl-tests/keystone-db-sync-h25mj" podStartSLOduration=2.9269293259999998 podStartE2EDuration="2.926929326s" podCreationTimestamp="2025-11-28 13:40:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-28 13:40:11.922778469 +0000 UTC m=+1222.775660289" watchObservedRunningTime="2025-11-28 13:40:11.926929326 +0000 UTC m=+1222.779811136" Nov 28 13:40:12 crc kubenswrapper[4970]: I1128 13:40:12.907203 4970 generic.go:334] "Generic (PLEG): container finished" podID="6859145c-f5f5-4e5f-a04c-aea7081007be" containerID="1f3aa2553a84343080af5009b2640fd359598a4ef84e55cfabfa5180558a8ff7" exitCode=0 Nov 28 13:40:12 crc kubenswrapper[4970]: I1128 13:40:12.907296 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-db-sync-h25mj" event={"ID":"6859145c-f5f5-4e5f-a04c-aea7081007be","Type":"ContainerDied","Data":"1f3aa2553a84343080af5009b2640fd359598a4ef84e55cfabfa5180558a8ff7"} Nov 28 13:40:14 
crc kubenswrapper[4970]: I1128 13:40:14.222937 4970 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone-db-sync-h25mj" Nov 28 13:40:14 crc kubenswrapper[4970]: I1128 13:40:14.394907 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6859145c-f5f5-4e5f-a04c-aea7081007be-config-data\") pod \"6859145c-f5f5-4e5f-a04c-aea7081007be\" (UID: \"6859145c-f5f5-4e5f-a04c-aea7081007be\") " Nov 28 13:40:14 crc kubenswrapper[4970]: I1128 13:40:14.394963 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8mnk9\" (UniqueName: \"kubernetes.io/projected/6859145c-f5f5-4e5f-a04c-aea7081007be-kube-api-access-8mnk9\") pod \"6859145c-f5f5-4e5f-a04c-aea7081007be\" (UID: \"6859145c-f5f5-4e5f-a04c-aea7081007be\") " Nov 28 13:40:14 crc kubenswrapper[4970]: I1128 13:40:14.402068 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6859145c-f5f5-4e5f-a04c-aea7081007be-kube-api-access-8mnk9" (OuterVolumeSpecName: "kube-api-access-8mnk9") pod "6859145c-f5f5-4e5f-a04c-aea7081007be" (UID: "6859145c-f5f5-4e5f-a04c-aea7081007be"). InnerVolumeSpecName "kube-api-access-8mnk9". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 28 13:40:14 crc kubenswrapper[4970]: I1128 13:40:14.442805 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6859145c-f5f5-4e5f-a04c-aea7081007be-config-data" (OuterVolumeSpecName: "config-data") pod "6859145c-f5f5-4e5f-a04c-aea7081007be" (UID: "6859145c-f5f5-4e5f-a04c-aea7081007be"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 28 13:40:14 crc kubenswrapper[4970]: I1128 13:40:14.496439 4970 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6859145c-f5f5-4e5f-a04c-aea7081007be-config-data\") on node \"crc\" DevicePath \"\"" Nov 28 13:40:14 crc kubenswrapper[4970]: I1128 13:40:14.496479 4970 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8mnk9\" (UniqueName: \"kubernetes.io/projected/6859145c-f5f5-4e5f-a04c-aea7081007be-kube-api-access-8mnk9\") on node \"crc\" DevicePath \"\"" Nov 28 13:40:14 crc kubenswrapper[4970]: I1128 13:40:14.928873 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-db-sync-h25mj" event={"ID":"6859145c-f5f5-4e5f-a04c-aea7081007be","Type":"ContainerDied","Data":"1128aebe220737ff2e2d775f194cbfdb69b4c7a868a4c6d474dac50a8fe30e9e"} Nov 28 13:40:14 crc kubenswrapper[4970]: I1128 13:40:14.928967 4970 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1128aebe220737ff2e2d775f194cbfdb69b4c7a868a4c6d474dac50a8fe30e9e" Nov 28 13:40:14 crc kubenswrapper[4970]: I1128 13:40:14.928969 4970 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="keystone-kuttl-tests/keystone-db-sync-h25mj" Nov 28 13:40:15 crc kubenswrapper[4970]: I1128 13:40:15.111355 4970 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["keystone-kuttl-tests/keystone-bootstrap-k4vqx"] Nov 28 13:40:15 crc kubenswrapper[4970]: E1128 13:40:15.111769 4970 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6859145c-f5f5-4e5f-a04c-aea7081007be" containerName="keystone-db-sync" Nov 28 13:40:15 crc kubenswrapper[4970]: I1128 13:40:15.111797 4970 state_mem.go:107] "Deleted CPUSet assignment" podUID="6859145c-f5f5-4e5f-a04c-aea7081007be" containerName="keystone-db-sync" Nov 28 13:40:15 crc kubenswrapper[4970]: I1128 13:40:15.112842 4970 memory_manager.go:354] "RemoveStaleState removing state" podUID="6859145c-f5f5-4e5f-a04c-aea7081007be" containerName="keystone-db-sync" Nov 28 13:40:15 crc kubenswrapper[4970]: I1128 13:40:15.113527 4970 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone-bootstrap-k4vqx" Nov 28 13:40:15 crc kubenswrapper[4970]: I1128 13:40:15.116919 4970 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"osp-secret" Nov 28 13:40:15 crc kubenswrapper[4970]: I1128 13:40:15.117537 4970 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"keystone-config-data" Nov 28 13:40:15 crc kubenswrapper[4970]: I1128 13:40:15.117856 4970 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"keystone" Nov 28 13:40:15 crc kubenswrapper[4970]: I1128 13:40:15.119589 4970 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"keystone-scripts" Nov 28 13:40:15 crc kubenswrapper[4970]: I1128 13:40:15.121055 4970 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"keystone-keystone-dockercfg-46bdv" Nov 28 13:40:15 crc kubenswrapper[4970]: I1128 13:40:15.135757 4970 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/keystone-bootstrap-k4vqx"] Nov 28 13:40:15 crc kubenswrapper[4970]: I1128 13:40:15.307938 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/97a3c098-4b92-44d4-affd-790cb32a5a3a-fernet-keys\") pod \"keystone-bootstrap-k4vqx\" (UID: \"97a3c098-4b92-44d4-affd-790cb32a5a3a\") " pod="keystone-kuttl-tests/keystone-bootstrap-k4vqx" Nov 28 13:40:15 crc kubenswrapper[4970]: I1128 13:40:15.308013 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4d2vp\" (UniqueName: \"kubernetes.io/projected/97a3c098-4b92-44d4-affd-790cb32a5a3a-kube-api-access-4d2vp\") pod \"keystone-bootstrap-k4vqx\" (UID: \"97a3c098-4b92-44d4-affd-790cb32a5a3a\") " pod="keystone-kuttl-tests/keystone-bootstrap-k4vqx" Nov 28 13:40:15 crc kubenswrapper[4970]: I1128 13:40:15.308080 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/97a3c098-4b92-44d4-affd-790cb32a5a3a-config-data\") pod \"keystone-bootstrap-k4vqx\" (UID: \"97a3c098-4b92-44d4-affd-790cb32a5a3a\") " pod="keystone-kuttl-tests/keystone-bootstrap-k4vqx" Nov 28 13:40:15 crc kubenswrapper[4970]: I1128 13:40:15.308143 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: 
\"kubernetes.io/secret/97a3c098-4b92-44d4-affd-790cb32a5a3a-credential-keys\") pod \"keystone-bootstrap-k4vqx\" (UID: \"97a3c098-4b92-44d4-affd-790cb32a5a3a\") " pod="keystone-kuttl-tests/keystone-bootstrap-k4vqx" Nov 28 13:40:15 crc kubenswrapper[4970]: I1128 13:40:15.308178 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/97a3c098-4b92-44d4-affd-790cb32a5a3a-scripts\") pod \"keystone-bootstrap-k4vqx\" (UID: \"97a3c098-4b92-44d4-affd-790cb32a5a3a\") " pod="keystone-kuttl-tests/keystone-bootstrap-k4vqx" Nov 28 13:40:15 crc kubenswrapper[4970]: I1128 13:40:15.409558 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/97a3c098-4b92-44d4-affd-790cb32a5a3a-credential-keys\") pod \"keystone-bootstrap-k4vqx\" (UID: \"97a3c098-4b92-44d4-affd-790cb32a5a3a\") " pod="keystone-kuttl-tests/keystone-bootstrap-k4vqx" Nov 28 13:40:15 crc kubenswrapper[4970]: I1128 13:40:15.409603 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/97a3c098-4b92-44d4-affd-790cb32a5a3a-scripts\") pod \"keystone-bootstrap-k4vqx\" (UID: \"97a3c098-4b92-44d4-affd-790cb32a5a3a\") " pod="keystone-kuttl-tests/keystone-bootstrap-k4vqx" Nov 28 13:40:15 crc kubenswrapper[4970]: I1128 13:40:15.409628 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/97a3c098-4b92-44d4-affd-790cb32a5a3a-fernet-keys\") pod \"keystone-bootstrap-k4vqx\" (UID: \"97a3c098-4b92-44d4-affd-790cb32a5a3a\") " pod="keystone-kuttl-tests/keystone-bootstrap-k4vqx" Nov 28 13:40:15 crc kubenswrapper[4970]: I1128 13:40:15.409802 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4d2vp\" (UniqueName: \"kubernetes.io/projected/97a3c098-4b92-44d4-affd-790cb32a5a3a-kube-api-access-4d2vp\") pod \"keystone-bootstrap-k4vqx\" (UID: \"97a3c098-4b92-44d4-affd-790cb32a5a3a\") " pod="keystone-kuttl-tests/keystone-bootstrap-k4vqx" Nov 28 13:40:15 crc kubenswrapper[4970]: I1128 13:40:15.410082 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/97a3c098-4b92-44d4-affd-790cb32a5a3a-config-data\") pod \"keystone-bootstrap-k4vqx\" (UID: \"97a3c098-4b92-44d4-affd-790cb32a5a3a\") " pod="keystone-kuttl-tests/keystone-bootstrap-k4vqx" Nov 28 13:40:15 crc kubenswrapper[4970]: I1128 13:40:15.414736 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/97a3c098-4b92-44d4-affd-790cb32a5a3a-scripts\") pod \"keystone-bootstrap-k4vqx\" (UID: \"97a3c098-4b92-44d4-affd-790cb32a5a3a\") " pod="keystone-kuttl-tests/keystone-bootstrap-k4vqx" Nov 28 13:40:15 crc kubenswrapper[4970]: I1128 13:40:15.415146 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/97a3c098-4b92-44d4-affd-790cb32a5a3a-credential-keys\") pod \"keystone-bootstrap-k4vqx\" (UID: \"97a3c098-4b92-44d4-affd-790cb32a5a3a\") " pod="keystone-kuttl-tests/keystone-bootstrap-k4vqx" Nov 28 13:40:15 crc kubenswrapper[4970]: I1128 13:40:15.415203 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/97a3c098-4b92-44d4-affd-790cb32a5a3a-fernet-keys\") pod 
\"keystone-bootstrap-k4vqx\" (UID: \"97a3c098-4b92-44d4-affd-790cb32a5a3a\") " pod="keystone-kuttl-tests/keystone-bootstrap-k4vqx" Nov 28 13:40:15 crc kubenswrapper[4970]: I1128 13:40:15.417868 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/97a3c098-4b92-44d4-affd-790cb32a5a3a-config-data\") pod \"keystone-bootstrap-k4vqx\" (UID: \"97a3c098-4b92-44d4-affd-790cb32a5a3a\") " pod="keystone-kuttl-tests/keystone-bootstrap-k4vqx" Nov 28 13:40:15 crc kubenswrapper[4970]: I1128 13:40:15.444695 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4d2vp\" (UniqueName: \"kubernetes.io/projected/97a3c098-4b92-44d4-affd-790cb32a5a3a-kube-api-access-4d2vp\") pod \"keystone-bootstrap-k4vqx\" (UID: \"97a3c098-4b92-44d4-affd-790cb32a5a3a\") " pod="keystone-kuttl-tests/keystone-bootstrap-k4vqx" Nov 28 13:40:15 crc kubenswrapper[4970]: I1128 13:40:15.445794 4970 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone-bootstrap-k4vqx" Nov 28 13:40:15 crc kubenswrapper[4970]: I1128 13:40:15.702054 4970 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/keystone-bootstrap-k4vqx"] Nov 28 13:40:15 crc kubenswrapper[4970]: I1128 13:40:15.938897 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-bootstrap-k4vqx" event={"ID":"97a3c098-4b92-44d4-affd-790cb32a5a3a","Type":"ContainerStarted","Data":"dca9d8e843ae6879f29de64545c0c1a8b168e16f9250bf61751bc68178232cab"} Nov 28 13:40:15 crc kubenswrapper[4970]: I1128 13:40:15.939331 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-bootstrap-k4vqx" event={"ID":"97a3c098-4b92-44d4-affd-790cb32a5a3a","Type":"ContainerStarted","Data":"81dcc82300343a37e2835bf53a6f991f7afc6e1e95bf30682f9152f15436c9ba"} Nov 28 13:40:15 crc kubenswrapper[4970]: I1128 13:40:15.964387 4970 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="keystone-kuttl-tests/keystone-bootstrap-k4vqx" podStartSLOduration=0.964372634 podStartE2EDuration="964.372634ms" podCreationTimestamp="2025-11-28 13:40:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-28 13:40:15.960615878 +0000 UTC m=+1226.813497678" watchObservedRunningTime="2025-11-28 13:40:15.964372634 +0000 UTC m=+1226.817254434" Nov 28 13:40:18 crc kubenswrapper[4970]: I1128 13:40:18.964921 4970 generic.go:334] "Generic (PLEG): container finished" podID="97a3c098-4b92-44d4-affd-790cb32a5a3a" containerID="dca9d8e843ae6879f29de64545c0c1a8b168e16f9250bf61751bc68178232cab" exitCode=0 Nov 28 13:40:18 crc kubenswrapper[4970]: I1128 13:40:18.965030 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-bootstrap-k4vqx" event={"ID":"97a3c098-4b92-44d4-affd-790cb32a5a3a","Type":"ContainerDied","Data":"dca9d8e843ae6879f29de64545c0c1a8b168e16f9250bf61751bc68178232cab"} Nov 28 13:40:20 crc kubenswrapper[4970]: I1128 13:40:20.307078 4970 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="keystone-kuttl-tests/keystone-bootstrap-k4vqx" Nov 28 13:40:20 crc kubenswrapper[4970]: I1128 13:40:20.487057 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/97a3c098-4b92-44d4-affd-790cb32a5a3a-scripts\") pod \"97a3c098-4b92-44d4-affd-790cb32a5a3a\" (UID: \"97a3c098-4b92-44d4-affd-790cb32a5a3a\") " Nov 28 13:40:20 crc kubenswrapper[4970]: I1128 13:40:20.487478 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4d2vp\" (UniqueName: \"kubernetes.io/projected/97a3c098-4b92-44d4-affd-790cb32a5a3a-kube-api-access-4d2vp\") pod \"97a3c098-4b92-44d4-affd-790cb32a5a3a\" (UID: \"97a3c098-4b92-44d4-affd-790cb32a5a3a\") " Nov 28 13:40:20 crc kubenswrapper[4970]: I1128 13:40:20.487542 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/97a3c098-4b92-44d4-affd-790cb32a5a3a-fernet-keys\") pod \"97a3c098-4b92-44d4-affd-790cb32a5a3a\" (UID: \"97a3c098-4b92-44d4-affd-790cb32a5a3a\") " Nov 28 13:40:20 crc kubenswrapper[4970]: I1128 13:40:20.487627 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/97a3c098-4b92-44d4-affd-790cb32a5a3a-credential-keys\") pod \"97a3c098-4b92-44d4-affd-790cb32a5a3a\" (UID: \"97a3c098-4b92-44d4-affd-790cb32a5a3a\") " Nov 28 13:40:20 crc kubenswrapper[4970]: I1128 13:40:20.487720 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/97a3c098-4b92-44d4-affd-790cb32a5a3a-config-data\") pod \"97a3c098-4b92-44d4-affd-790cb32a5a3a\" (UID: \"97a3c098-4b92-44d4-affd-790cb32a5a3a\") " Nov 28 13:40:20 crc kubenswrapper[4970]: I1128 13:40:20.493473 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/97a3c098-4b92-44d4-affd-790cb32a5a3a-scripts" (OuterVolumeSpecName: "scripts") pod "97a3c098-4b92-44d4-affd-790cb32a5a3a" (UID: "97a3c098-4b92-44d4-affd-790cb32a5a3a"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 28 13:40:20 crc kubenswrapper[4970]: I1128 13:40:20.494025 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/97a3c098-4b92-44d4-affd-790cb32a5a3a-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "97a3c098-4b92-44d4-affd-790cb32a5a3a" (UID: "97a3c098-4b92-44d4-affd-790cb32a5a3a"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 28 13:40:20 crc kubenswrapper[4970]: I1128 13:40:20.494069 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/97a3c098-4b92-44d4-affd-790cb32a5a3a-kube-api-access-4d2vp" (OuterVolumeSpecName: "kube-api-access-4d2vp") pod "97a3c098-4b92-44d4-affd-790cb32a5a3a" (UID: "97a3c098-4b92-44d4-affd-790cb32a5a3a"). InnerVolumeSpecName "kube-api-access-4d2vp". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 28 13:40:20 crc kubenswrapper[4970]: I1128 13:40:20.500967 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/97a3c098-4b92-44d4-affd-790cb32a5a3a-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "97a3c098-4b92-44d4-affd-790cb32a5a3a" (UID: "97a3c098-4b92-44d4-affd-790cb32a5a3a"). InnerVolumeSpecName "credential-keys". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 28 13:40:20 crc kubenswrapper[4970]: I1128 13:40:20.524150 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/97a3c098-4b92-44d4-affd-790cb32a5a3a-config-data" (OuterVolumeSpecName: "config-data") pod "97a3c098-4b92-44d4-affd-790cb32a5a3a" (UID: "97a3c098-4b92-44d4-affd-790cb32a5a3a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 28 13:40:20 crc kubenswrapper[4970]: I1128 13:40:20.590093 4970 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/97a3c098-4b92-44d4-affd-790cb32a5a3a-config-data\") on node \"crc\" DevicePath \"\"" Nov 28 13:40:20 crc kubenswrapper[4970]: I1128 13:40:20.590146 4970 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/97a3c098-4b92-44d4-affd-790cb32a5a3a-scripts\") on node \"crc\" DevicePath \"\"" Nov 28 13:40:20 crc kubenswrapper[4970]: I1128 13:40:20.590158 4970 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4d2vp\" (UniqueName: \"kubernetes.io/projected/97a3c098-4b92-44d4-affd-790cb32a5a3a-kube-api-access-4d2vp\") on node \"crc\" DevicePath \"\"" Nov 28 13:40:20 crc kubenswrapper[4970]: I1128 13:40:20.590169 4970 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/97a3c098-4b92-44d4-affd-790cb32a5a3a-fernet-keys\") on node \"crc\" DevicePath \"\"" Nov 28 13:40:20 crc kubenswrapper[4970]: I1128 13:40:20.590178 4970 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/97a3c098-4b92-44d4-affd-790cb32a5a3a-credential-keys\") on node \"crc\" DevicePath \"\"" Nov 28 13:40:20 crc kubenswrapper[4970]: I1128 13:40:20.986570 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-bootstrap-k4vqx" event={"ID":"97a3c098-4b92-44d4-affd-790cb32a5a3a","Type":"ContainerDied","Data":"81dcc82300343a37e2835bf53a6f991f7afc6e1e95bf30682f9152f15436c9ba"} Nov 28 13:40:20 crc kubenswrapper[4970]: I1128 13:40:20.986627 4970 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="81dcc82300343a37e2835bf53a6f991f7afc6e1e95bf30682f9152f15436c9ba" Nov 28 13:40:20 crc kubenswrapper[4970]: I1128 13:40:20.986683 4970 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone-bootstrap-k4vqx" Nov 28 13:40:21 crc kubenswrapper[4970]: I1128 13:40:21.091789 4970 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["keystone-kuttl-tests/keystone-d6cfdb8b6-jnkxt"] Nov 28 13:40:21 crc kubenswrapper[4970]: E1128 13:40:21.092172 4970 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="97a3c098-4b92-44d4-affd-790cb32a5a3a" containerName="keystone-bootstrap" Nov 28 13:40:21 crc kubenswrapper[4970]: I1128 13:40:21.092192 4970 state_mem.go:107] "Deleted CPUSet assignment" podUID="97a3c098-4b92-44d4-affd-790cb32a5a3a" containerName="keystone-bootstrap" Nov 28 13:40:21 crc kubenswrapper[4970]: I1128 13:40:21.092402 4970 memory_manager.go:354] "RemoveStaleState removing state" podUID="97a3c098-4b92-44d4-affd-790cb32a5a3a" containerName="keystone-bootstrap" Nov 28 13:40:21 crc kubenswrapper[4970]: I1128 13:40:21.092935 4970 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="keystone-kuttl-tests/keystone-d6cfdb8b6-jnkxt" Nov 28 13:40:21 crc kubenswrapper[4970]: I1128 13:40:21.095065 4970 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"keystone-scripts" Nov 28 13:40:21 crc kubenswrapper[4970]: I1128 13:40:21.095329 4970 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"keystone-config-data" Nov 28 13:40:21 crc kubenswrapper[4970]: I1128 13:40:21.095423 4970 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"keystone-keystone-dockercfg-46bdv" Nov 28 13:40:21 crc kubenswrapper[4970]: I1128 13:40:21.097183 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/79e80fee-34f1-48bf-884d-4fd32b593378-credential-keys\") pod \"keystone-d6cfdb8b6-jnkxt\" (UID: \"79e80fee-34f1-48bf-884d-4fd32b593378\") " pod="keystone-kuttl-tests/keystone-d6cfdb8b6-jnkxt" Nov 28 13:40:21 crc kubenswrapper[4970]: I1128 13:40:21.097283 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/79e80fee-34f1-48bf-884d-4fd32b593378-fernet-keys\") pod \"keystone-d6cfdb8b6-jnkxt\" (UID: \"79e80fee-34f1-48bf-884d-4fd32b593378\") " pod="keystone-kuttl-tests/keystone-d6cfdb8b6-jnkxt" Nov 28 13:40:21 crc kubenswrapper[4970]: I1128 13:40:21.097311 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fgzcv\" (UniqueName: \"kubernetes.io/projected/79e80fee-34f1-48bf-884d-4fd32b593378-kube-api-access-fgzcv\") pod \"keystone-d6cfdb8b6-jnkxt\" (UID: \"79e80fee-34f1-48bf-884d-4fd32b593378\") " pod="keystone-kuttl-tests/keystone-d6cfdb8b6-jnkxt" Nov 28 13:40:21 crc kubenswrapper[4970]: I1128 13:40:21.097337 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/79e80fee-34f1-48bf-884d-4fd32b593378-scripts\") pod \"keystone-d6cfdb8b6-jnkxt\" (UID: \"79e80fee-34f1-48bf-884d-4fd32b593378\") " pod="keystone-kuttl-tests/keystone-d6cfdb8b6-jnkxt" Nov 28 13:40:21 crc kubenswrapper[4970]: I1128 13:40:21.097386 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/79e80fee-34f1-48bf-884d-4fd32b593378-config-data\") pod \"keystone-d6cfdb8b6-jnkxt\" (UID: \"79e80fee-34f1-48bf-884d-4fd32b593378\") " pod="keystone-kuttl-tests/keystone-d6cfdb8b6-jnkxt" Nov 28 13:40:21 crc kubenswrapper[4970]: I1128 13:40:21.101814 4970 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"keystone" Nov 28 13:40:21 crc kubenswrapper[4970]: I1128 13:40:21.118208 4970 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/keystone-d6cfdb8b6-jnkxt"] Nov 28 13:40:21 crc kubenswrapper[4970]: I1128 13:40:21.198659 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/79e80fee-34f1-48bf-884d-4fd32b593378-config-data\") pod \"keystone-d6cfdb8b6-jnkxt\" (UID: \"79e80fee-34f1-48bf-884d-4fd32b593378\") " pod="keystone-kuttl-tests/keystone-d6cfdb8b6-jnkxt" Nov 28 13:40:21 crc kubenswrapper[4970]: I1128 13:40:21.199049 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: 
\"kubernetes.io/secret/79e80fee-34f1-48bf-884d-4fd32b593378-credential-keys\") pod \"keystone-d6cfdb8b6-jnkxt\" (UID: \"79e80fee-34f1-48bf-884d-4fd32b593378\") " pod="keystone-kuttl-tests/keystone-d6cfdb8b6-jnkxt" Nov 28 13:40:21 crc kubenswrapper[4970]: I1128 13:40:21.199189 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/79e80fee-34f1-48bf-884d-4fd32b593378-fernet-keys\") pod \"keystone-d6cfdb8b6-jnkxt\" (UID: \"79e80fee-34f1-48bf-884d-4fd32b593378\") " pod="keystone-kuttl-tests/keystone-d6cfdb8b6-jnkxt" Nov 28 13:40:21 crc kubenswrapper[4970]: I1128 13:40:21.199291 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fgzcv\" (UniqueName: \"kubernetes.io/projected/79e80fee-34f1-48bf-884d-4fd32b593378-kube-api-access-fgzcv\") pod \"keystone-d6cfdb8b6-jnkxt\" (UID: \"79e80fee-34f1-48bf-884d-4fd32b593378\") " pod="keystone-kuttl-tests/keystone-d6cfdb8b6-jnkxt" Nov 28 13:40:21 crc kubenswrapper[4970]: I1128 13:40:21.199366 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/79e80fee-34f1-48bf-884d-4fd32b593378-scripts\") pod \"keystone-d6cfdb8b6-jnkxt\" (UID: \"79e80fee-34f1-48bf-884d-4fd32b593378\") " pod="keystone-kuttl-tests/keystone-d6cfdb8b6-jnkxt" Nov 28 13:40:21 crc kubenswrapper[4970]: I1128 13:40:21.202754 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/79e80fee-34f1-48bf-884d-4fd32b593378-credential-keys\") pod \"keystone-d6cfdb8b6-jnkxt\" (UID: \"79e80fee-34f1-48bf-884d-4fd32b593378\") " pod="keystone-kuttl-tests/keystone-d6cfdb8b6-jnkxt" Nov 28 13:40:21 crc kubenswrapper[4970]: I1128 13:40:21.203680 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/79e80fee-34f1-48bf-884d-4fd32b593378-config-data\") pod \"keystone-d6cfdb8b6-jnkxt\" (UID: \"79e80fee-34f1-48bf-884d-4fd32b593378\") " pod="keystone-kuttl-tests/keystone-d6cfdb8b6-jnkxt" Nov 28 13:40:21 crc kubenswrapper[4970]: I1128 13:40:21.204331 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/79e80fee-34f1-48bf-884d-4fd32b593378-fernet-keys\") pod \"keystone-d6cfdb8b6-jnkxt\" (UID: \"79e80fee-34f1-48bf-884d-4fd32b593378\") " pod="keystone-kuttl-tests/keystone-d6cfdb8b6-jnkxt" Nov 28 13:40:21 crc kubenswrapper[4970]: I1128 13:40:21.206665 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/79e80fee-34f1-48bf-884d-4fd32b593378-scripts\") pod \"keystone-d6cfdb8b6-jnkxt\" (UID: \"79e80fee-34f1-48bf-884d-4fd32b593378\") " pod="keystone-kuttl-tests/keystone-d6cfdb8b6-jnkxt" Nov 28 13:40:21 crc kubenswrapper[4970]: I1128 13:40:21.217092 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fgzcv\" (UniqueName: \"kubernetes.io/projected/79e80fee-34f1-48bf-884d-4fd32b593378-kube-api-access-fgzcv\") pod \"keystone-d6cfdb8b6-jnkxt\" (UID: \"79e80fee-34f1-48bf-884d-4fd32b593378\") " pod="keystone-kuttl-tests/keystone-d6cfdb8b6-jnkxt" Nov 28 13:40:21 crc kubenswrapper[4970]: I1128 13:40:21.420768 4970 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="keystone-kuttl-tests/keystone-d6cfdb8b6-jnkxt" Nov 28 13:40:21 crc kubenswrapper[4970]: I1128 13:40:21.909962 4970 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/keystone-d6cfdb8b6-jnkxt"] Nov 28 13:40:21 crc kubenswrapper[4970]: I1128 13:40:21.995862 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-d6cfdb8b6-jnkxt" event={"ID":"79e80fee-34f1-48bf-884d-4fd32b593378","Type":"ContainerStarted","Data":"d01750a1c02e468ce55bdde2191d7d056715f08b8872e97a195c26ca1522c1f9"} Nov 28 13:40:23 crc kubenswrapper[4970]: I1128 13:40:23.007368 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-d6cfdb8b6-jnkxt" event={"ID":"79e80fee-34f1-48bf-884d-4fd32b593378","Type":"ContainerStarted","Data":"322e7511dfe863cb5bc88e33b112cdb74ca29bfa1b285e988fdd629f4298d8cc"} Nov 28 13:40:23 crc kubenswrapper[4970]: I1128 13:40:23.007721 4970 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="keystone-kuttl-tests/keystone-d6cfdb8b6-jnkxt" Nov 28 13:40:23 crc kubenswrapper[4970]: I1128 13:40:23.037840 4970 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="keystone-kuttl-tests/keystone-d6cfdb8b6-jnkxt" podStartSLOduration=2.037817291 podStartE2EDuration="2.037817291s" podCreationTimestamp="2025-11-28 13:40:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-28 13:40:23.030607128 +0000 UTC m=+1233.883488968" watchObservedRunningTime="2025-11-28 13:40:23.037817291 +0000 UTC m=+1233.890699121" Nov 28 13:40:51 crc kubenswrapper[4970]: I1128 13:40:51.334448 4970 patch_prober.go:28] interesting pod/machine-config-daemon-tjrng container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 28 13:40:51 crc kubenswrapper[4970]: I1128 13:40:51.335110 4970 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-tjrng" podUID="70bedd43-c527-436e-b47b-0b9ec5b10601" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 28 13:40:52 crc kubenswrapper[4970]: I1128 13:40:52.881068 4970 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="keystone-kuttl-tests/keystone-d6cfdb8b6-jnkxt" Nov 28 13:41:08 crc kubenswrapper[4970]: I1128 13:41:08.567247 4970 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["keystone-kuttl-tests/keystone-db-sync-h25mj"] Nov 28 13:41:08 crc kubenswrapper[4970]: I1128 13:41:08.578333 4970 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["keystone-kuttl-tests/keystone-db-sync-h25mj"] Nov 28 13:41:08 crc kubenswrapper[4970]: I1128 13:41:08.587734 4970 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["keystone-kuttl-tests/keystone-bootstrap-k4vqx"] Nov 28 13:41:08 crc kubenswrapper[4970]: I1128 13:41:08.605978 4970 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["keystone-kuttl-tests/keystone-bootstrap-k4vqx"] Nov 28 13:41:08 crc kubenswrapper[4970]: I1128 13:41:08.616846 4970 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["keystone-kuttl-tests/keystone-d6cfdb8b6-jnkxt"] Nov 28 13:41:08 crc kubenswrapper[4970]: I1128 13:41:08.617176 4970 kuberuntime_container.go:808] "Killing 
container with a grace period" pod="keystone-kuttl-tests/keystone-d6cfdb8b6-jnkxt" podUID="79e80fee-34f1-48bf-884d-4fd32b593378" containerName="keystone-api" containerID="cri-o://322e7511dfe863cb5bc88e33b112cdb74ca29bfa1b285e988fdd629f4298d8cc" gracePeriod=30 Nov 28 13:41:08 crc kubenswrapper[4970]: I1128 13:41:08.639159 4970 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["keystone-kuttl-tests/keystone5656-account-delete-sbckc"] Nov 28 13:41:08 crc kubenswrapper[4970]: I1128 13:41:08.641352 4970 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone5656-account-delete-sbckc" Nov 28 13:41:08 crc kubenswrapper[4970]: I1128 13:41:08.651267 4970 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/keystone5656-account-delete-sbckc"] Nov 28 13:41:08 crc kubenswrapper[4970]: I1128 13:41:08.714196 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bnn7t\" (UniqueName: \"kubernetes.io/projected/02bf0f86-96f4-41cd-960e-2c9f43f30da6-kube-api-access-bnn7t\") pod \"keystone5656-account-delete-sbckc\" (UID: \"02bf0f86-96f4-41cd-960e-2c9f43f30da6\") " pod="keystone-kuttl-tests/keystone5656-account-delete-sbckc" Nov 28 13:41:08 crc kubenswrapper[4970]: I1128 13:41:08.714523 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/02bf0f86-96f4-41cd-960e-2c9f43f30da6-operator-scripts\") pod \"keystone5656-account-delete-sbckc\" (UID: \"02bf0f86-96f4-41cd-960e-2c9f43f30da6\") " pod="keystone-kuttl-tests/keystone5656-account-delete-sbckc" Nov 28 13:41:08 crc kubenswrapper[4970]: I1128 13:41:08.815488 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bnn7t\" (UniqueName: \"kubernetes.io/projected/02bf0f86-96f4-41cd-960e-2c9f43f30da6-kube-api-access-bnn7t\") pod \"keystone5656-account-delete-sbckc\" (UID: \"02bf0f86-96f4-41cd-960e-2c9f43f30da6\") " pod="keystone-kuttl-tests/keystone5656-account-delete-sbckc" Nov 28 13:41:08 crc kubenswrapper[4970]: I1128 13:41:08.815552 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/02bf0f86-96f4-41cd-960e-2c9f43f30da6-operator-scripts\") pod \"keystone5656-account-delete-sbckc\" (UID: \"02bf0f86-96f4-41cd-960e-2c9f43f30da6\") " pod="keystone-kuttl-tests/keystone5656-account-delete-sbckc" Nov 28 13:41:08 crc kubenswrapper[4970]: I1128 13:41:08.816380 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/02bf0f86-96f4-41cd-960e-2c9f43f30da6-operator-scripts\") pod \"keystone5656-account-delete-sbckc\" (UID: \"02bf0f86-96f4-41cd-960e-2c9f43f30da6\") " pod="keystone-kuttl-tests/keystone5656-account-delete-sbckc" Nov 28 13:41:08 crc kubenswrapper[4970]: I1128 13:41:08.839257 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bnn7t\" (UniqueName: \"kubernetes.io/projected/02bf0f86-96f4-41cd-960e-2c9f43f30da6-kube-api-access-bnn7t\") pod \"keystone5656-account-delete-sbckc\" (UID: \"02bf0f86-96f4-41cd-960e-2c9f43f30da6\") " pod="keystone-kuttl-tests/keystone5656-account-delete-sbckc" Nov 28 13:41:08 crc kubenswrapper[4970]: I1128 13:41:08.961705 4970 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="keystone-kuttl-tests/keystone5656-account-delete-sbckc" Nov 28 13:41:09 crc kubenswrapper[4970]: I1128 13:41:09.389521 4970 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6859145c-f5f5-4e5f-a04c-aea7081007be" path="/var/lib/kubelet/pods/6859145c-f5f5-4e5f-a04c-aea7081007be/volumes" Nov 28 13:41:09 crc kubenswrapper[4970]: I1128 13:41:09.390573 4970 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="97a3c098-4b92-44d4-affd-790cb32a5a3a" path="/var/lib/kubelet/pods/97a3c098-4b92-44d4-affd-790cb32a5a3a/volumes" Nov 28 13:41:09 crc kubenswrapper[4970]: I1128 13:41:09.428488 4970 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/keystone5656-account-delete-sbckc"] Nov 28 13:41:09 crc kubenswrapper[4970]: I1128 13:41:09.444188 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone5656-account-delete-sbckc" event={"ID":"02bf0f86-96f4-41cd-960e-2c9f43f30da6","Type":"ContainerStarted","Data":"5f0beb983090c3fb7bdc5c8867da24919fe6a7185936f6b11d9f995247cd35c5"} Nov 28 13:41:10 crc kubenswrapper[4970]: I1128 13:41:10.454783 4970 generic.go:334] "Generic (PLEG): container finished" podID="02bf0f86-96f4-41cd-960e-2c9f43f30da6" containerID="ce6c74640208dae6d3e76db85346f9d929887c92a252876e5f8c0da4a7f8d373" exitCode=0 Nov 28 13:41:10 crc kubenswrapper[4970]: I1128 13:41:10.454915 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone5656-account-delete-sbckc" event={"ID":"02bf0f86-96f4-41cd-960e-2c9f43f30da6","Type":"ContainerDied","Data":"ce6c74640208dae6d3e76db85346f9d929887c92a252876e5f8c0da4a7f8d373"} Nov 28 13:41:11 crc kubenswrapper[4970]: I1128 13:41:11.924189 4970 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone5656-account-delete-sbckc" Nov 28 13:41:12 crc kubenswrapper[4970]: I1128 13:41:12.068309 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/02bf0f86-96f4-41cd-960e-2c9f43f30da6-operator-scripts\") pod \"02bf0f86-96f4-41cd-960e-2c9f43f30da6\" (UID: \"02bf0f86-96f4-41cd-960e-2c9f43f30da6\") " Nov 28 13:41:12 crc kubenswrapper[4970]: I1128 13:41:12.068422 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bnn7t\" (UniqueName: \"kubernetes.io/projected/02bf0f86-96f4-41cd-960e-2c9f43f30da6-kube-api-access-bnn7t\") pod \"02bf0f86-96f4-41cd-960e-2c9f43f30da6\" (UID: \"02bf0f86-96f4-41cd-960e-2c9f43f30da6\") " Nov 28 13:41:12 crc kubenswrapper[4970]: I1128 13:41:12.070907 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/02bf0f86-96f4-41cd-960e-2c9f43f30da6-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "02bf0f86-96f4-41cd-960e-2c9f43f30da6" (UID: "02bf0f86-96f4-41cd-960e-2c9f43f30da6"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 28 13:41:12 crc kubenswrapper[4970]: I1128 13:41:12.075398 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/02bf0f86-96f4-41cd-960e-2c9f43f30da6-kube-api-access-bnn7t" (OuterVolumeSpecName: "kube-api-access-bnn7t") pod "02bf0f86-96f4-41cd-960e-2c9f43f30da6" (UID: "02bf0f86-96f4-41cd-960e-2c9f43f30da6"). InnerVolumeSpecName "kube-api-access-bnn7t". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 28 13:41:12 crc kubenswrapper[4970]: I1128 13:41:12.150189 4970 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone-d6cfdb8b6-jnkxt" Nov 28 13:41:12 crc kubenswrapper[4970]: I1128 13:41:12.169489 4970 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bnn7t\" (UniqueName: \"kubernetes.io/projected/02bf0f86-96f4-41cd-960e-2c9f43f30da6-kube-api-access-bnn7t\") on node \"crc\" DevicePath \"\"" Nov 28 13:41:12 crc kubenswrapper[4970]: I1128 13:41:12.169511 4970 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/02bf0f86-96f4-41cd-960e-2c9f43f30da6-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 28 13:41:12 crc kubenswrapper[4970]: I1128 13:41:12.270507 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/79e80fee-34f1-48bf-884d-4fd32b593378-credential-keys\") pod \"79e80fee-34f1-48bf-884d-4fd32b593378\" (UID: \"79e80fee-34f1-48bf-884d-4fd32b593378\") " Nov 28 13:41:12 crc kubenswrapper[4970]: I1128 13:41:12.270564 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/79e80fee-34f1-48bf-884d-4fd32b593378-config-data\") pod \"79e80fee-34f1-48bf-884d-4fd32b593378\" (UID: \"79e80fee-34f1-48bf-884d-4fd32b593378\") " Nov 28 13:41:12 crc kubenswrapper[4970]: I1128 13:41:12.270583 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/79e80fee-34f1-48bf-884d-4fd32b593378-scripts\") pod \"79e80fee-34f1-48bf-884d-4fd32b593378\" (UID: \"79e80fee-34f1-48bf-884d-4fd32b593378\") " Nov 28 13:41:12 crc kubenswrapper[4970]: I1128 13:41:12.270616 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/79e80fee-34f1-48bf-884d-4fd32b593378-fernet-keys\") pod \"79e80fee-34f1-48bf-884d-4fd32b593378\" (UID: \"79e80fee-34f1-48bf-884d-4fd32b593378\") " Nov 28 13:41:12 crc kubenswrapper[4970]: I1128 13:41:12.270676 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fgzcv\" (UniqueName: \"kubernetes.io/projected/79e80fee-34f1-48bf-884d-4fd32b593378-kube-api-access-fgzcv\") pod \"79e80fee-34f1-48bf-884d-4fd32b593378\" (UID: \"79e80fee-34f1-48bf-884d-4fd32b593378\") " Nov 28 13:41:12 crc kubenswrapper[4970]: I1128 13:41:12.274930 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/79e80fee-34f1-48bf-884d-4fd32b593378-scripts" (OuterVolumeSpecName: "scripts") pod "79e80fee-34f1-48bf-884d-4fd32b593378" (UID: "79e80fee-34f1-48bf-884d-4fd32b593378"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 28 13:41:12 crc kubenswrapper[4970]: I1128 13:41:12.274975 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/79e80fee-34f1-48bf-884d-4fd32b593378-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "79e80fee-34f1-48bf-884d-4fd32b593378" (UID: "79e80fee-34f1-48bf-884d-4fd32b593378"). InnerVolumeSpecName "credential-keys". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 28 13:41:12 crc kubenswrapper[4970]: I1128 13:41:12.274993 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/79e80fee-34f1-48bf-884d-4fd32b593378-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "79e80fee-34f1-48bf-884d-4fd32b593378" (UID: "79e80fee-34f1-48bf-884d-4fd32b593378"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 28 13:41:12 crc kubenswrapper[4970]: I1128 13:41:12.276618 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/79e80fee-34f1-48bf-884d-4fd32b593378-kube-api-access-fgzcv" (OuterVolumeSpecName: "kube-api-access-fgzcv") pod "79e80fee-34f1-48bf-884d-4fd32b593378" (UID: "79e80fee-34f1-48bf-884d-4fd32b593378"). InnerVolumeSpecName "kube-api-access-fgzcv". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 28 13:41:12 crc kubenswrapper[4970]: I1128 13:41:12.290976 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/79e80fee-34f1-48bf-884d-4fd32b593378-config-data" (OuterVolumeSpecName: "config-data") pod "79e80fee-34f1-48bf-884d-4fd32b593378" (UID: "79e80fee-34f1-48bf-884d-4fd32b593378"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 28 13:41:12 crc kubenswrapper[4970]: I1128 13:41:12.372636 4970 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fgzcv\" (UniqueName: \"kubernetes.io/projected/79e80fee-34f1-48bf-884d-4fd32b593378-kube-api-access-fgzcv\") on node \"crc\" DevicePath \"\"" Nov 28 13:41:12 crc kubenswrapper[4970]: I1128 13:41:12.372683 4970 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/79e80fee-34f1-48bf-884d-4fd32b593378-credential-keys\") on node \"crc\" DevicePath \"\"" Nov 28 13:41:12 crc kubenswrapper[4970]: I1128 13:41:12.372694 4970 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/79e80fee-34f1-48bf-884d-4fd32b593378-config-data\") on node \"crc\" DevicePath \"\"" Nov 28 13:41:12 crc kubenswrapper[4970]: I1128 13:41:12.372704 4970 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/79e80fee-34f1-48bf-884d-4fd32b593378-scripts\") on node \"crc\" DevicePath \"\"" Nov 28 13:41:12 crc kubenswrapper[4970]: I1128 13:41:12.372713 4970 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/79e80fee-34f1-48bf-884d-4fd32b593378-fernet-keys\") on node \"crc\" DevicePath \"\"" Nov 28 13:41:12 crc kubenswrapper[4970]: I1128 13:41:12.482771 4970 generic.go:334] "Generic (PLEG): container finished" podID="79e80fee-34f1-48bf-884d-4fd32b593378" containerID="322e7511dfe863cb5bc88e33b112cdb74ca29bfa1b285e988fdd629f4298d8cc" exitCode=0 Nov 28 13:41:12 crc kubenswrapper[4970]: I1128 13:41:12.482971 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-d6cfdb8b6-jnkxt" event={"ID":"79e80fee-34f1-48bf-884d-4fd32b593378","Type":"ContainerDied","Data":"322e7511dfe863cb5bc88e33b112cdb74ca29bfa1b285e988fdd629f4298d8cc"} Nov 28 13:41:12 crc kubenswrapper[4970]: I1128 13:41:12.483054 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-d6cfdb8b6-jnkxt" 
event={"ID":"79e80fee-34f1-48bf-884d-4fd32b593378","Type":"ContainerDied","Data":"d01750a1c02e468ce55bdde2191d7d056715f08b8872e97a195c26ca1522c1f9"} Nov 28 13:41:12 crc kubenswrapper[4970]: I1128 13:41:12.483116 4970 scope.go:117] "RemoveContainer" containerID="322e7511dfe863cb5bc88e33b112cdb74ca29bfa1b285e988fdd629f4298d8cc" Nov 28 13:41:12 crc kubenswrapper[4970]: I1128 13:41:12.483302 4970 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone-d6cfdb8b6-jnkxt" Nov 28 13:41:12 crc kubenswrapper[4970]: I1128 13:41:12.490840 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone5656-account-delete-sbckc" event={"ID":"02bf0f86-96f4-41cd-960e-2c9f43f30da6","Type":"ContainerDied","Data":"5f0beb983090c3fb7bdc5c8867da24919fe6a7185936f6b11d9f995247cd35c5"} Nov 28 13:41:12 crc kubenswrapper[4970]: I1128 13:41:12.491022 4970 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5f0beb983090c3fb7bdc5c8867da24919fe6a7185936f6b11d9f995247cd35c5" Nov 28 13:41:12 crc kubenswrapper[4970]: I1128 13:41:12.491193 4970 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone5656-account-delete-sbckc" Nov 28 13:41:12 crc kubenswrapper[4970]: I1128 13:41:12.527750 4970 scope.go:117] "RemoveContainer" containerID="322e7511dfe863cb5bc88e33b112cdb74ca29bfa1b285e988fdd629f4298d8cc" Nov 28 13:41:12 crc kubenswrapper[4970]: E1128 13:41:12.528383 4970 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"322e7511dfe863cb5bc88e33b112cdb74ca29bfa1b285e988fdd629f4298d8cc\": container with ID starting with 322e7511dfe863cb5bc88e33b112cdb74ca29bfa1b285e988fdd629f4298d8cc not found: ID does not exist" containerID="322e7511dfe863cb5bc88e33b112cdb74ca29bfa1b285e988fdd629f4298d8cc" Nov 28 13:41:12 crc kubenswrapper[4970]: I1128 13:41:12.528454 4970 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"322e7511dfe863cb5bc88e33b112cdb74ca29bfa1b285e988fdd629f4298d8cc"} err="failed to get container status \"322e7511dfe863cb5bc88e33b112cdb74ca29bfa1b285e988fdd629f4298d8cc\": rpc error: code = NotFound desc = could not find container \"322e7511dfe863cb5bc88e33b112cdb74ca29bfa1b285e988fdd629f4298d8cc\": container with ID starting with 322e7511dfe863cb5bc88e33b112cdb74ca29bfa1b285e988fdd629f4298d8cc not found: ID does not exist" Nov 28 13:41:12 crc kubenswrapper[4970]: I1128 13:41:12.537256 4970 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["keystone-kuttl-tests/keystone-d6cfdb8b6-jnkxt"] Nov 28 13:41:12 crc kubenswrapper[4970]: I1128 13:41:12.542882 4970 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["keystone-kuttl-tests/keystone-d6cfdb8b6-jnkxt"] Nov 28 13:41:13 crc kubenswrapper[4970]: I1128 13:41:13.394363 4970 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="79e80fee-34f1-48bf-884d-4fd32b593378" path="/var/lib/kubelet/pods/79e80fee-34f1-48bf-884d-4fd32b593378/volumes" Nov 28 13:41:13 crc kubenswrapper[4970]: I1128 13:41:13.658148 4970 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["keystone-kuttl-tests/keystone-db-create-klrz9"] Nov 28 13:41:13 crc kubenswrapper[4970]: I1128 13:41:13.670855 4970 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["keystone-kuttl-tests/keystone-db-create-klrz9"] Nov 28 13:41:13 crc kubenswrapper[4970]: I1128 13:41:13.683127 4970 kubelet.go:2437] "SyncLoop DELETE" 
source="api" pods=["keystone-kuttl-tests/keystone5656-account-delete-sbckc"] Nov 28 13:41:13 crc kubenswrapper[4970]: I1128 13:41:13.692947 4970 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["keystone-kuttl-tests/keystone-5656-account-create-update-hwplx"] Nov 28 13:41:13 crc kubenswrapper[4970]: I1128 13:41:13.698237 4970 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["keystone-kuttl-tests/keystone5656-account-delete-sbckc"] Nov 28 13:41:13 crc kubenswrapper[4970]: I1128 13:41:13.703265 4970 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["keystone-kuttl-tests/keystone-5656-account-create-update-hwplx"] Nov 28 13:41:13 crc kubenswrapper[4970]: I1128 13:41:13.729785 4970 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["keystone-kuttl-tests/keystone-db-create-8m9wz"] Nov 28 13:41:13 crc kubenswrapper[4970]: E1128 13:41:13.730057 4970 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="79e80fee-34f1-48bf-884d-4fd32b593378" containerName="keystone-api" Nov 28 13:41:13 crc kubenswrapper[4970]: I1128 13:41:13.730088 4970 state_mem.go:107] "Deleted CPUSet assignment" podUID="79e80fee-34f1-48bf-884d-4fd32b593378" containerName="keystone-api" Nov 28 13:41:13 crc kubenswrapper[4970]: E1128 13:41:13.730102 4970 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="02bf0f86-96f4-41cd-960e-2c9f43f30da6" containerName="mariadb-account-delete" Nov 28 13:41:13 crc kubenswrapper[4970]: I1128 13:41:13.730108 4970 state_mem.go:107] "Deleted CPUSet assignment" podUID="02bf0f86-96f4-41cd-960e-2c9f43f30da6" containerName="mariadb-account-delete" Nov 28 13:41:13 crc kubenswrapper[4970]: I1128 13:41:13.730235 4970 memory_manager.go:354] "RemoveStaleState removing state" podUID="79e80fee-34f1-48bf-884d-4fd32b593378" containerName="keystone-api" Nov 28 13:41:13 crc kubenswrapper[4970]: I1128 13:41:13.730250 4970 memory_manager.go:354] "RemoveStaleState removing state" podUID="02bf0f86-96f4-41cd-960e-2c9f43f30da6" containerName="mariadb-account-delete" Nov 28 13:41:13 crc kubenswrapper[4970]: I1128 13:41:13.730634 4970 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone-db-create-8m9wz" Nov 28 13:41:13 crc kubenswrapper[4970]: I1128 13:41:13.772574 4970 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/keystone-db-create-8m9wz"] Nov 28 13:41:13 crc kubenswrapper[4970]: I1128 13:41:13.794900 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a0015a1f-b332-4d23-9c85-83fce1551460-operator-scripts\") pod \"keystone-db-create-8m9wz\" (UID: \"a0015a1f-b332-4d23-9c85-83fce1551460\") " pod="keystone-kuttl-tests/keystone-db-create-8m9wz" Nov 28 13:41:13 crc kubenswrapper[4970]: I1128 13:41:13.795201 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jkrpc\" (UniqueName: \"kubernetes.io/projected/a0015a1f-b332-4d23-9c85-83fce1551460-kube-api-access-jkrpc\") pod \"keystone-db-create-8m9wz\" (UID: \"a0015a1f-b332-4d23-9c85-83fce1551460\") " pod="keystone-kuttl-tests/keystone-db-create-8m9wz" Nov 28 13:41:13 crc kubenswrapper[4970]: I1128 13:41:13.843703 4970 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["keystone-kuttl-tests/keystone-85fc-account-create-update-dth5q"] Nov 28 13:41:13 crc kubenswrapper[4970]: I1128 13:41:13.844874 4970 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="keystone-kuttl-tests/keystone-85fc-account-create-update-dth5q" Nov 28 13:41:13 crc kubenswrapper[4970]: I1128 13:41:13.847307 4970 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"keystone-db-secret" Nov 28 13:41:13 crc kubenswrapper[4970]: I1128 13:41:13.855872 4970 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/keystone-85fc-account-create-update-dth5q"] Nov 28 13:41:13 crc kubenswrapper[4970]: I1128 13:41:13.897142 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a0015a1f-b332-4d23-9c85-83fce1551460-operator-scripts\") pod \"keystone-db-create-8m9wz\" (UID: \"a0015a1f-b332-4d23-9c85-83fce1551460\") " pod="keystone-kuttl-tests/keystone-db-create-8m9wz" Nov 28 13:41:13 crc kubenswrapper[4970]: I1128 13:41:13.897269 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jkrpc\" (UniqueName: \"kubernetes.io/projected/a0015a1f-b332-4d23-9c85-83fce1551460-kube-api-access-jkrpc\") pod \"keystone-db-create-8m9wz\" (UID: \"a0015a1f-b332-4d23-9c85-83fce1551460\") " pod="keystone-kuttl-tests/keystone-db-create-8m9wz" Nov 28 13:41:13 crc kubenswrapper[4970]: I1128 13:41:13.897941 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a0015a1f-b332-4d23-9c85-83fce1551460-operator-scripts\") pod \"keystone-db-create-8m9wz\" (UID: \"a0015a1f-b332-4d23-9c85-83fce1551460\") " pod="keystone-kuttl-tests/keystone-db-create-8m9wz" Nov 28 13:41:13 crc kubenswrapper[4970]: I1128 13:41:13.920024 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jkrpc\" (UniqueName: \"kubernetes.io/projected/a0015a1f-b332-4d23-9c85-83fce1551460-kube-api-access-jkrpc\") pod \"keystone-db-create-8m9wz\" (UID: \"a0015a1f-b332-4d23-9c85-83fce1551460\") " pod="keystone-kuttl-tests/keystone-db-create-8m9wz" Nov 28 13:41:13 crc kubenswrapper[4970]: I1128 13:41:13.998929 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d97c0e2d-5899-4b03-94ef-d0ade8964b2d-operator-scripts\") pod \"keystone-85fc-account-create-update-dth5q\" (UID: \"d97c0e2d-5899-4b03-94ef-d0ade8964b2d\") " pod="keystone-kuttl-tests/keystone-85fc-account-create-update-dth5q" Nov 28 13:41:13 crc kubenswrapper[4970]: I1128 13:41:13.999649 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x48xq\" (UniqueName: \"kubernetes.io/projected/d97c0e2d-5899-4b03-94ef-d0ade8964b2d-kube-api-access-x48xq\") pod \"keystone-85fc-account-create-update-dth5q\" (UID: \"d97c0e2d-5899-4b03-94ef-d0ade8964b2d\") " pod="keystone-kuttl-tests/keystone-85fc-account-create-update-dth5q" Nov 28 13:41:14 crc kubenswrapper[4970]: I1128 13:41:14.054669 4970 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="keystone-kuttl-tests/keystone-db-create-8m9wz" Nov 28 13:41:14 crc kubenswrapper[4970]: I1128 13:41:14.101365 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d97c0e2d-5899-4b03-94ef-d0ade8964b2d-operator-scripts\") pod \"keystone-85fc-account-create-update-dth5q\" (UID: \"d97c0e2d-5899-4b03-94ef-d0ade8964b2d\") " pod="keystone-kuttl-tests/keystone-85fc-account-create-update-dth5q" Nov 28 13:41:14 crc kubenswrapper[4970]: I1128 13:41:14.101447 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x48xq\" (UniqueName: \"kubernetes.io/projected/d97c0e2d-5899-4b03-94ef-d0ade8964b2d-kube-api-access-x48xq\") pod \"keystone-85fc-account-create-update-dth5q\" (UID: \"d97c0e2d-5899-4b03-94ef-d0ade8964b2d\") " pod="keystone-kuttl-tests/keystone-85fc-account-create-update-dth5q" Nov 28 13:41:14 crc kubenswrapper[4970]: I1128 13:41:14.102611 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d97c0e2d-5899-4b03-94ef-d0ade8964b2d-operator-scripts\") pod \"keystone-85fc-account-create-update-dth5q\" (UID: \"d97c0e2d-5899-4b03-94ef-d0ade8964b2d\") " pod="keystone-kuttl-tests/keystone-85fc-account-create-update-dth5q" Nov 28 13:41:14 crc kubenswrapper[4970]: I1128 13:41:14.120868 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x48xq\" (UniqueName: \"kubernetes.io/projected/d97c0e2d-5899-4b03-94ef-d0ade8964b2d-kube-api-access-x48xq\") pod \"keystone-85fc-account-create-update-dth5q\" (UID: \"d97c0e2d-5899-4b03-94ef-d0ade8964b2d\") " pod="keystone-kuttl-tests/keystone-85fc-account-create-update-dth5q" Nov 28 13:41:14 crc kubenswrapper[4970]: I1128 13:41:14.170616 4970 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="keystone-kuttl-tests/keystone-85fc-account-create-update-dth5q" Nov 28 13:41:14 crc kubenswrapper[4970]: W1128 13:41:14.484531 4970 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda0015a1f_b332_4d23_9c85_83fce1551460.slice/crio-abfdfb85214f7f3daa021c6db6b9bc1a131f444e4ec344c5c3ae36e0fb8a90fc WatchSource:0}: Error finding container abfdfb85214f7f3daa021c6db6b9bc1a131f444e4ec344c5c3ae36e0fb8a90fc: Status 404 returned error can't find the container with id abfdfb85214f7f3daa021c6db6b9bc1a131f444e4ec344c5c3ae36e0fb8a90fc Nov 28 13:41:14 crc kubenswrapper[4970]: I1128 13:41:14.486084 4970 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/keystone-db-create-8m9wz"] Nov 28 13:41:14 crc kubenswrapper[4970]: I1128 13:41:14.513826 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-db-create-8m9wz" event={"ID":"a0015a1f-b332-4d23-9c85-83fce1551460","Type":"ContainerStarted","Data":"abfdfb85214f7f3daa021c6db6b9bc1a131f444e4ec344c5c3ae36e0fb8a90fc"} Nov 28 13:41:14 crc kubenswrapper[4970]: I1128 13:41:14.587947 4970 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/keystone-85fc-account-create-update-dth5q"] Nov 28 13:41:14 crc kubenswrapper[4970]: W1128 13:41:14.596732 4970 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd97c0e2d_5899_4b03_94ef_d0ade8964b2d.slice/crio-3458e8c4b57de773d89b64eb8a2d2bfde14735076abef466a688aab504113c35 WatchSource:0}: Error finding container 3458e8c4b57de773d89b64eb8a2d2bfde14735076abef466a688aab504113c35: Status 404 returned error can't find the container with id 3458e8c4b57de773d89b64eb8a2d2bfde14735076abef466a688aab504113c35 Nov 28 13:41:15 crc kubenswrapper[4970]: I1128 13:41:15.392179 4970 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="02171036-5aa9-4a47-9469-5faa9a495e23" path="/var/lib/kubelet/pods/02171036-5aa9-4a47-9469-5faa9a495e23/volumes" Nov 28 13:41:15 crc kubenswrapper[4970]: I1128 13:41:15.393752 4970 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="02bf0f86-96f4-41cd-960e-2c9f43f30da6" path="/var/lib/kubelet/pods/02bf0f86-96f4-41cd-960e-2c9f43f30da6/volumes" Nov 28 13:41:15 crc kubenswrapper[4970]: I1128 13:41:15.394731 4970 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a4e8dd6a-a78b-4f89-bab5-20fd7e8da6a9" path="/var/lib/kubelet/pods/a4e8dd6a-a78b-4f89-bab5-20fd7e8da6a9/volumes" Nov 28 13:41:15 crc kubenswrapper[4970]: I1128 13:41:15.525729 4970 generic.go:334] "Generic (PLEG): container finished" podID="d97c0e2d-5899-4b03-94ef-d0ade8964b2d" containerID="d995d3c0dd322a8417043e4c613395fe5c7c2e6637d3313257ef13a9ec9b05a2" exitCode=0 Nov 28 13:41:15 crc kubenswrapper[4970]: I1128 13:41:15.525823 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-85fc-account-create-update-dth5q" event={"ID":"d97c0e2d-5899-4b03-94ef-d0ade8964b2d","Type":"ContainerDied","Data":"d995d3c0dd322a8417043e4c613395fe5c7c2e6637d3313257ef13a9ec9b05a2"} Nov 28 13:41:15 crc kubenswrapper[4970]: I1128 13:41:15.525870 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-85fc-account-create-update-dth5q" event={"ID":"d97c0e2d-5899-4b03-94ef-d0ade8964b2d","Type":"ContainerStarted","Data":"3458e8c4b57de773d89b64eb8a2d2bfde14735076abef466a688aab504113c35"} Nov 28 13:41:15 
crc kubenswrapper[4970]: I1128 13:41:15.527485 4970 generic.go:334] "Generic (PLEG): container finished" podID="a0015a1f-b332-4d23-9c85-83fce1551460" containerID="1c27b6736a74509268beb39adc22f4d98ca67cf21927baba449a6aa64b66a4ca" exitCode=0 Nov 28 13:41:15 crc kubenswrapper[4970]: I1128 13:41:15.527518 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-db-create-8m9wz" event={"ID":"a0015a1f-b332-4d23-9c85-83fce1551460","Type":"ContainerDied","Data":"1c27b6736a74509268beb39adc22f4d98ca67cf21927baba449a6aa64b66a4ca"} Nov 28 13:41:16 crc kubenswrapper[4970]: I1128 13:41:16.901754 4970 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone-db-create-8m9wz" Nov 28 13:41:16 crc kubenswrapper[4970]: I1128 13:41:16.906751 4970 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone-85fc-account-create-update-dth5q" Nov 28 13:41:17 crc kubenswrapper[4970]: I1128 13:41:17.049885 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a0015a1f-b332-4d23-9c85-83fce1551460-operator-scripts\") pod \"a0015a1f-b332-4d23-9c85-83fce1551460\" (UID: \"a0015a1f-b332-4d23-9c85-83fce1551460\") " Nov 28 13:41:17 crc kubenswrapper[4970]: I1128 13:41:17.049921 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jkrpc\" (UniqueName: \"kubernetes.io/projected/a0015a1f-b332-4d23-9c85-83fce1551460-kube-api-access-jkrpc\") pod \"a0015a1f-b332-4d23-9c85-83fce1551460\" (UID: \"a0015a1f-b332-4d23-9c85-83fce1551460\") " Nov 28 13:41:17 crc kubenswrapper[4970]: I1128 13:41:17.049959 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x48xq\" (UniqueName: \"kubernetes.io/projected/d97c0e2d-5899-4b03-94ef-d0ade8964b2d-kube-api-access-x48xq\") pod \"d97c0e2d-5899-4b03-94ef-d0ade8964b2d\" (UID: \"d97c0e2d-5899-4b03-94ef-d0ade8964b2d\") " Nov 28 13:41:17 crc kubenswrapper[4970]: I1128 13:41:17.050044 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d97c0e2d-5899-4b03-94ef-d0ade8964b2d-operator-scripts\") pod \"d97c0e2d-5899-4b03-94ef-d0ade8964b2d\" (UID: \"d97c0e2d-5899-4b03-94ef-d0ade8964b2d\") " Nov 28 13:41:17 crc kubenswrapper[4970]: I1128 13:41:17.050406 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a0015a1f-b332-4d23-9c85-83fce1551460-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "a0015a1f-b332-4d23-9c85-83fce1551460" (UID: "a0015a1f-b332-4d23-9c85-83fce1551460"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 28 13:41:17 crc kubenswrapper[4970]: I1128 13:41:17.050802 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d97c0e2d-5899-4b03-94ef-d0ade8964b2d-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "d97c0e2d-5899-4b03-94ef-d0ade8964b2d" (UID: "d97c0e2d-5899-4b03-94ef-d0ade8964b2d"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 28 13:41:17 crc kubenswrapper[4970]: I1128 13:41:17.055932 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d97c0e2d-5899-4b03-94ef-d0ade8964b2d-kube-api-access-x48xq" (OuterVolumeSpecName: "kube-api-access-x48xq") pod "d97c0e2d-5899-4b03-94ef-d0ade8964b2d" (UID: "d97c0e2d-5899-4b03-94ef-d0ade8964b2d"). InnerVolumeSpecName "kube-api-access-x48xq". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 28 13:41:17 crc kubenswrapper[4970]: I1128 13:41:17.057478 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0015a1f-b332-4d23-9c85-83fce1551460-kube-api-access-jkrpc" (OuterVolumeSpecName: "kube-api-access-jkrpc") pod "a0015a1f-b332-4d23-9c85-83fce1551460" (UID: "a0015a1f-b332-4d23-9c85-83fce1551460"). InnerVolumeSpecName "kube-api-access-jkrpc". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 28 13:41:17 crc kubenswrapper[4970]: I1128 13:41:17.151814 4970 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x48xq\" (UniqueName: \"kubernetes.io/projected/d97c0e2d-5899-4b03-94ef-d0ade8964b2d-kube-api-access-x48xq\") on node \"crc\" DevicePath \"\"" Nov 28 13:41:17 crc kubenswrapper[4970]: I1128 13:41:17.151871 4970 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d97c0e2d-5899-4b03-94ef-d0ade8964b2d-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 28 13:41:17 crc kubenswrapper[4970]: I1128 13:41:17.151891 4970 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jkrpc\" (UniqueName: \"kubernetes.io/projected/a0015a1f-b332-4d23-9c85-83fce1551460-kube-api-access-jkrpc\") on node \"crc\" DevicePath \"\"" Nov 28 13:41:17 crc kubenswrapper[4970]: I1128 13:41:17.151910 4970 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a0015a1f-b332-4d23-9c85-83fce1551460-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 28 13:41:17 crc kubenswrapper[4970]: I1128 13:41:17.548593 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-85fc-account-create-update-dth5q" event={"ID":"d97c0e2d-5899-4b03-94ef-d0ade8964b2d","Type":"ContainerDied","Data":"3458e8c4b57de773d89b64eb8a2d2bfde14735076abef466a688aab504113c35"} Nov 28 13:41:17 crc kubenswrapper[4970]: I1128 13:41:17.548633 4970 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone-85fc-account-create-update-dth5q" Nov 28 13:41:17 crc kubenswrapper[4970]: I1128 13:41:17.548641 4970 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3458e8c4b57de773d89b64eb8a2d2bfde14735076abef466a688aab504113c35" Nov 28 13:41:17 crc kubenswrapper[4970]: I1128 13:41:17.550468 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-db-create-8m9wz" event={"ID":"a0015a1f-b332-4d23-9c85-83fce1551460","Type":"ContainerDied","Data":"abfdfb85214f7f3daa021c6db6b9bc1a131f444e4ec344c5c3ae36e0fb8a90fc"} Nov 28 13:41:17 crc kubenswrapper[4970]: I1128 13:41:17.550489 4970 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="abfdfb85214f7f3daa021c6db6b9bc1a131f444e4ec344c5c3ae36e0fb8a90fc" Nov 28 13:41:17 crc kubenswrapper[4970]: I1128 13:41:17.550539 4970 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="keystone-kuttl-tests/keystone-db-create-8m9wz" Nov 28 13:41:19 crc kubenswrapper[4970]: I1128 13:41:19.435602 4970 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["keystone-kuttl-tests/keystone-db-sync-b9wh9"] Nov 28 13:41:19 crc kubenswrapper[4970]: E1128 13:41:19.436485 4970 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d97c0e2d-5899-4b03-94ef-d0ade8964b2d" containerName="mariadb-account-create-update" Nov 28 13:41:19 crc kubenswrapper[4970]: I1128 13:41:19.436505 4970 state_mem.go:107] "Deleted CPUSet assignment" podUID="d97c0e2d-5899-4b03-94ef-d0ade8964b2d" containerName="mariadb-account-create-update" Nov 28 13:41:19 crc kubenswrapper[4970]: E1128 13:41:19.436528 4970 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a0015a1f-b332-4d23-9c85-83fce1551460" containerName="mariadb-database-create" Nov 28 13:41:19 crc kubenswrapper[4970]: I1128 13:41:19.436536 4970 state_mem.go:107] "Deleted CPUSet assignment" podUID="a0015a1f-b332-4d23-9c85-83fce1551460" containerName="mariadb-database-create" Nov 28 13:41:19 crc kubenswrapper[4970]: I1128 13:41:19.436698 4970 memory_manager.go:354] "RemoveStaleState removing state" podUID="a0015a1f-b332-4d23-9c85-83fce1551460" containerName="mariadb-database-create" Nov 28 13:41:19 crc kubenswrapper[4970]: I1128 13:41:19.436717 4970 memory_manager.go:354] "RemoveStaleState removing state" podUID="d97c0e2d-5899-4b03-94ef-d0ade8964b2d" containerName="mariadb-account-create-update" Nov 28 13:41:19 crc kubenswrapper[4970]: I1128 13:41:19.438164 4970 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone-db-sync-b9wh9" Nov 28 13:41:19 crc kubenswrapper[4970]: I1128 13:41:19.441136 4970 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"keystone-keystone-dockercfg-qxxq5" Nov 28 13:41:19 crc kubenswrapper[4970]: I1128 13:41:19.441362 4970 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"keystone" Nov 28 13:41:19 crc kubenswrapper[4970]: I1128 13:41:19.441543 4970 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"keystone-scripts" Nov 28 13:41:19 crc kubenswrapper[4970]: I1128 13:41:19.441998 4970 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"keystone-config-data" Nov 28 13:41:19 crc kubenswrapper[4970]: I1128 13:41:19.453074 4970 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/keystone-db-sync-b9wh9"] Nov 28 13:41:19 crc kubenswrapper[4970]: I1128 13:41:19.486279 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c24bb341-be92-44c8-bcea-d15bcb539a26-config-data\") pod \"keystone-db-sync-b9wh9\" (UID: \"c24bb341-be92-44c8-bcea-d15bcb539a26\") " pod="keystone-kuttl-tests/keystone-db-sync-b9wh9" Nov 28 13:41:19 crc kubenswrapper[4970]: I1128 13:41:19.486359 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ph4p4\" (UniqueName: \"kubernetes.io/projected/c24bb341-be92-44c8-bcea-d15bcb539a26-kube-api-access-ph4p4\") pod \"keystone-db-sync-b9wh9\" (UID: \"c24bb341-be92-44c8-bcea-d15bcb539a26\") " pod="keystone-kuttl-tests/keystone-db-sync-b9wh9" Nov 28 13:41:19 crc kubenswrapper[4970]: I1128 13:41:19.588401 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/c24bb341-be92-44c8-bcea-d15bcb539a26-config-data\") pod \"keystone-db-sync-b9wh9\" (UID: \"c24bb341-be92-44c8-bcea-d15bcb539a26\") " pod="keystone-kuttl-tests/keystone-db-sync-b9wh9" Nov 28 13:41:19 crc kubenswrapper[4970]: I1128 13:41:19.588536 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ph4p4\" (UniqueName: \"kubernetes.io/projected/c24bb341-be92-44c8-bcea-d15bcb539a26-kube-api-access-ph4p4\") pod \"keystone-db-sync-b9wh9\" (UID: \"c24bb341-be92-44c8-bcea-d15bcb539a26\") " pod="keystone-kuttl-tests/keystone-db-sync-b9wh9" Nov 28 13:41:19 crc kubenswrapper[4970]: I1128 13:41:19.595420 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c24bb341-be92-44c8-bcea-d15bcb539a26-config-data\") pod \"keystone-db-sync-b9wh9\" (UID: \"c24bb341-be92-44c8-bcea-d15bcb539a26\") " pod="keystone-kuttl-tests/keystone-db-sync-b9wh9" Nov 28 13:41:19 crc kubenswrapper[4970]: I1128 13:41:19.613001 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ph4p4\" (UniqueName: \"kubernetes.io/projected/c24bb341-be92-44c8-bcea-d15bcb539a26-kube-api-access-ph4p4\") pod \"keystone-db-sync-b9wh9\" (UID: \"c24bb341-be92-44c8-bcea-d15bcb539a26\") " pod="keystone-kuttl-tests/keystone-db-sync-b9wh9" Nov 28 13:41:19 crc kubenswrapper[4970]: I1128 13:41:19.769858 4970 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone-db-sync-b9wh9" Nov 28 13:41:20 crc kubenswrapper[4970]: I1128 13:41:20.728593 4970 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/keystone-db-sync-b9wh9"] Nov 28 13:41:21 crc kubenswrapper[4970]: I1128 13:41:21.333713 4970 patch_prober.go:28] interesting pod/machine-config-daemon-tjrng container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 28 13:41:21 crc kubenswrapper[4970]: I1128 13:41:21.334165 4970 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-tjrng" podUID="70bedd43-c527-436e-b47b-0b9ec5b10601" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 28 13:41:21 crc kubenswrapper[4970]: I1128 13:41:21.591791 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-db-sync-b9wh9" event={"ID":"c24bb341-be92-44c8-bcea-d15bcb539a26","Type":"ContainerStarted","Data":"f7241326c4b9209487d1844c103cedd43712580a7e4ee2a4627478f5efd866ed"} Nov 28 13:41:21 crc kubenswrapper[4970]: I1128 13:41:21.592081 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-db-sync-b9wh9" event={"ID":"c24bb341-be92-44c8-bcea-d15bcb539a26","Type":"ContainerStarted","Data":"96bedcb9fc5207ac41393792bd984301d2f88a972c01aeccec23c1d0b9cbcbac"} Nov 28 13:41:21 crc kubenswrapper[4970]: I1128 13:41:21.609483 4970 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="keystone-kuttl-tests/keystone-db-sync-b9wh9" podStartSLOduration=2.609455566 podStartE2EDuration="2.609455566s" podCreationTimestamp="2025-11-28 13:41:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2025-11-28 13:41:21.60851943 +0000 UTC m=+1292.461401250" watchObservedRunningTime="2025-11-28 13:41:21.609455566 +0000 UTC m=+1292.462337416" Nov 28 13:41:22 crc kubenswrapper[4970]: I1128 13:41:22.603824 4970 generic.go:334] "Generic (PLEG): container finished" podID="c24bb341-be92-44c8-bcea-d15bcb539a26" containerID="f7241326c4b9209487d1844c103cedd43712580a7e4ee2a4627478f5efd866ed" exitCode=0 Nov 28 13:41:22 crc kubenswrapper[4970]: I1128 13:41:22.603958 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-db-sync-b9wh9" event={"ID":"c24bb341-be92-44c8-bcea-d15bcb539a26","Type":"ContainerDied","Data":"f7241326c4b9209487d1844c103cedd43712580a7e4ee2a4627478f5efd866ed"} Nov 28 13:41:23 crc kubenswrapper[4970]: I1128 13:41:23.960291 4970 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone-db-sync-b9wh9" Nov 28 13:41:24 crc kubenswrapper[4970]: I1128 13:41:24.083952 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ph4p4\" (UniqueName: \"kubernetes.io/projected/c24bb341-be92-44c8-bcea-d15bcb539a26-kube-api-access-ph4p4\") pod \"c24bb341-be92-44c8-bcea-d15bcb539a26\" (UID: \"c24bb341-be92-44c8-bcea-d15bcb539a26\") " Nov 28 13:41:24 crc kubenswrapper[4970]: I1128 13:41:24.084044 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c24bb341-be92-44c8-bcea-d15bcb539a26-config-data\") pod \"c24bb341-be92-44c8-bcea-d15bcb539a26\" (UID: \"c24bb341-be92-44c8-bcea-d15bcb539a26\") " Nov 28 13:41:24 crc kubenswrapper[4970]: I1128 13:41:24.091437 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c24bb341-be92-44c8-bcea-d15bcb539a26-kube-api-access-ph4p4" (OuterVolumeSpecName: "kube-api-access-ph4p4") pod "c24bb341-be92-44c8-bcea-d15bcb539a26" (UID: "c24bb341-be92-44c8-bcea-d15bcb539a26"). InnerVolumeSpecName "kube-api-access-ph4p4". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 28 13:41:24 crc kubenswrapper[4970]: I1128 13:41:24.116935 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c24bb341-be92-44c8-bcea-d15bcb539a26-config-data" (OuterVolumeSpecName: "config-data") pod "c24bb341-be92-44c8-bcea-d15bcb539a26" (UID: "c24bb341-be92-44c8-bcea-d15bcb539a26"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 28 13:41:24 crc kubenswrapper[4970]: I1128 13:41:24.185330 4970 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ph4p4\" (UniqueName: \"kubernetes.io/projected/c24bb341-be92-44c8-bcea-d15bcb539a26-kube-api-access-ph4p4\") on node \"crc\" DevicePath \"\"" Nov 28 13:41:24 crc kubenswrapper[4970]: I1128 13:41:24.185369 4970 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c24bb341-be92-44c8-bcea-d15bcb539a26-config-data\") on node \"crc\" DevicePath \"\"" Nov 28 13:41:24 crc kubenswrapper[4970]: I1128 13:41:24.625467 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-db-sync-b9wh9" event={"ID":"c24bb341-be92-44c8-bcea-d15bcb539a26","Type":"ContainerDied","Data":"96bedcb9fc5207ac41393792bd984301d2f88a972c01aeccec23c1d0b9cbcbac"} Nov 28 13:41:24 crc kubenswrapper[4970]: I1128 13:41:24.625522 4970 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="96bedcb9fc5207ac41393792bd984301d2f88a972c01aeccec23c1d0b9cbcbac" Nov 28 13:41:24 crc kubenswrapper[4970]: I1128 13:41:24.625590 4970 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone-db-sync-b9wh9" Nov 28 13:41:25 crc kubenswrapper[4970]: I1128 13:41:25.184876 4970 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["keystone-kuttl-tests/keystone-bootstrap-kc54s"] Nov 28 13:41:25 crc kubenswrapper[4970]: E1128 13:41:25.185850 4970 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c24bb341-be92-44c8-bcea-d15bcb539a26" containerName="keystone-db-sync" Nov 28 13:41:25 crc kubenswrapper[4970]: I1128 13:41:25.185875 4970 state_mem.go:107] "Deleted CPUSet assignment" podUID="c24bb341-be92-44c8-bcea-d15bcb539a26" containerName="keystone-db-sync" Nov 28 13:41:25 crc kubenswrapper[4970]: I1128 13:41:25.186120 4970 memory_manager.go:354] "RemoveStaleState removing state" podUID="c24bb341-be92-44c8-bcea-d15bcb539a26" containerName="keystone-db-sync" Nov 28 13:41:25 crc kubenswrapper[4970]: I1128 13:41:25.187118 4970 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="keystone-kuttl-tests/keystone-bootstrap-kc54s" Nov 28 13:41:25 crc kubenswrapper[4970]: I1128 13:41:25.190339 4970 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"keystone-scripts" Nov 28 13:41:25 crc kubenswrapper[4970]: I1128 13:41:25.190459 4970 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"osp-secret" Nov 28 13:41:25 crc kubenswrapper[4970]: I1128 13:41:25.190541 4970 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"keystone" Nov 28 13:41:25 crc kubenswrapper[4970]: I1128 13:41:25.190636 4970 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"keystone-keystone-dockercfg-qxxq5" Nov 28 13:41:25 crc kubenswrapper[4970]: I1128 13:41:25.193135 4970 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"keystone-config-data" Nov 28 13:41:25 crc kubenswrapper[4970]: I1128 13:41:25.212631 4970 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/keystone-bootstrap-kc54s"] Nov 28 13:41:25 crc kubenswrapper[4970]: I1128 13:41:25.301661 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d22c4593-6019-4a0c-9ca3-2da7907946be-config-data\") pod \"keystone-bootstrap-kc54s\" (UID: \"d22c4593-6019-4a0c-9ca3-2da7907946be\") " pod="keystone-kuttl-tests/keystone-bootstrap-kc54s" Nov 28 13:41:25 crc kubenswrapper[4970]: I1128 13:41:25.301766 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gmnjb\" (UniqueName: \"kubernetes.io/projected/d22c4593-6019-4a0c-9ca3-2da7907946be-kube-api-access-gmnjb\") pod \"keystone-bootstrap-kc54s\" (UID: \"d22c4593-6019-4a0c-9ca3-2da7907946be\") " pod="keystone-kuttl-tests/keystone-bootstrap-kc54s" Nov 28 13:41:25 crc kubenswrapper[4970]: I1128 13:41:25.301824 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d22c4593-6019-4a0c-9ca3-2da7907946be-scripts\") pod \"keystone-bootstrap-kc54s\" (UID: \"d22c4593-6019-4a0c-9ca3-2da7907946be\") " pod="keystone-kuttl-tests/keystone-bootstrap-kc54s" Nov 28 13:41:25 crc kubenswrapper[4970]: I1128 13:41:25.301856 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/d22c4593-6019-4a0c-9ca3-2da7907946be-fernet-keys\") pod \"keystone-bootstrap-kc54s\" (UID: \"d22c4593-6019-4a0c-9ca3-2da7907946be\") " pod="keystone-kuttl-tests/keystone-bootstrap-kc54s" Nov 28 13:41:25 crc kubenswrapper[4970]: I1128 13:41:25.301921 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/d22c4593-6019-4a0c-9ca3-2da7907946be-credential-keys\") pod \"keystone-bootstrap-kc54s\" (UID: \"d22c4593-6019-4a0c-9ca3-2da7907946be\") " pod="keystone-kuttl-tests/keystone-bootstrap-kc54s" Nov 28 13:41:25 crc kubenswrapper[4970]: I1128 13:41:25.403022 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d22c4593-6019-4a0c-9ca3-2da7907946be-config-data\") pod \"keystone-bootstrap-kc54s\" (UID: \"d22c4593-6019-4a0c-9ca3-2da7907946be\") " pod="keystone-kuttl-tests/keystone-bootstrap-kc54s" Nov 28 13:41:25 crc 
kubenswrapper[4970]: I1128 13:41:25.403160 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gmnjb\" (UniqueName: \"kubernetes.io/projected/d22c4593-6019-4a0c-9ca3-2da7907946be-kube-api-access-gmnjb\") pod \"keystone-bootstrap-kc54s\" (UID: \"d22c4593-6019-4a0c-9ca3-2da7907946be\") " pod="keystone-kuttl-tests/keystone-bootstrap-kc54s" Nov 28 13:41:25 crc kubenswrapper[4970]: I1128 13:41:25.403253 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d22c4593-6019-4a0c-9ca3-2da7907946be-scripts\") pod \"keystone-bootstrap-kc54s\" (UID: \"d22c4593-6019-4a0c-9ca3-2da7907946be\") " pod="keystone-kuttl-tests/keystone-bootstrap-kc54s" Nov 28 13:41:25 crc kubenswrapper[4970]: I1128 13:41:25.403302 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/d22c4593-6019-4a0c-9ca3-2da7907946be-fernet-keys\") pod \"keystone-bootstrap-kc54s\" (UID: \"d22c4593-6019-4a0c-9ca3-2da7907946be\") " pod="keystone-kuttl-tests/keystone-bootstrap-kc54s" Nov 28 13:41:25 crc kubenswrapper[4970]: I1128 13:41:25.403362 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/d22c4593-6019-4a0c-9ca3-2da7907946be-credential-keys\") pod \"keystone-bootstrap-kc54s\" (UID: \"d22c4593-6019-4a0c-9ca3-2da7907946be\") " pod="keystone-kuttl-tests/keystone-bootstrap-kc54s" Nov 28 13:41:25 crc kubenswrapper[4970]: I1128 13:41:25.415160 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d22c4593-6019-4a0c-9ca3-2da7907946be-scripts\") pod \"keystone-bootstrap-kc54s\" (UID: \"d22c4593-6019-4a0c-9ca3-2da7907946be\") " pod="keystone-kuttl-tests/keystone-bootstrap-kc54s" Nov 28 13:41:25 crc kubenswrapper[4970]: I1128 13:41:25.415208 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/d22c4593-6019-4a0c-9ca3-2da7907946be-fernet-keys\") pod \"keystone-bootstrap-kc54s\" (UID: \"d22c4593-6019-4a0c-9ca3-2da7907946be\") " pod="keystone-kuttl-tests/keystone-bootstrap-kc54s" Nov 28 13:41:25 crc kubenswrapper[4970]: I1128 13:41:25.415366 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d22c4593-6019-4a0c-9ca3-2da7907946be-config-data\") pod \"keystone-bootstrap-kc54s\" (UID: \"d22c4593-6019-4a0c-9ca3-2da7907946be\") " pod="keystone-kuttl-tests/keystone-bootstrap-kc54s" Nov 28 13:41:25 crc kubenswrapper[4970]: I1128 13:41:25.419090 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/d22c4593-6019-4a0c-9ca3-2da7907946be-credential-keys\") pod \"keystone-bootstrap-kc54s\" (UID: \"d22c4593-6019-4a0c-9ca3-2da7907946be\") " pod="keystone-kuttl-tests/keystone-bootstrap-kc54s" Nov 28 13:41:25 crc kubenswrapper[4970]: I1128 13:41:25.439125 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gmnjb\" (UniqueName: \"kubernetes.io/projected/d22c4593-6019-4a0c-9ca3-2da7907946be-kube-api-access-gmnjb\") pod \"keystone-bootstrap-kc54s\" (UID: \"d22c4593-6019-4a0c-9ca3-2da7907946be\") " pod="keystone-kuttl-tests/keystone-bootstrap-kc54s" Nov 28 13:41:25 crc kubenswrapper[4970]: I1128 13:41:25.503448 4970 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="keystone-kuttl-tests/keystone-bootstrap-kc54s" Nov 28 13:41:25 crc kubenswrapper[4970]: I1128 13:41:25.805579 4970 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/keystone-bootstrap-kc54s"] Nov 28 13:41:25 crc kubenswrapper[4970]: W1128 13:41:25.812679 4970 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd22c4593_6019_4a0c_9ca3_2da7907946be.slice/crio-eec31d479638396367e654976374fe9dc2f8b7f09b0945a9e6e5b339ffc07210 WatchSource:0}: Error finding container eec31d479638396367e654976374fe9dc2f8b7f09b0945a9e6e5b339ffc07210: Status 404 returned error can't find the container with id eec31d479638396367e654976374fe9dc2f8b7f09b0945a9e6e5b339ffc07210 Nov 28 13:41:26 crc kubenswrapper[4970]: I1128 13:41:26.649796 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-bootstrap-kc54s" event={"ID":"d22c4593-6019-4a0c-9ca3-2da7907946be","Type":"ContainerStarted","Data":"b4a1d256477224d248ffc936d1c5d9aca1aaae06c142ef33adec4c8996aa215b"} Nov 28 13:41:26 crc kubenswrapper[4970]: I1128 13:41:26.651776 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-bootstrap-kc54s" event={"ID":"d22c4593-6019-4a0c-9ca3-2da7907946be","Type":"ContainerStarted","Data":"eec31d479638396367e654976374fe9dc2f8b7f09b0945a9e6e5b339ffc07210"} Nov 28 13:41:26 crc kubenswrapper[4970]: I1128 13:41:26.685821 4970 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="keystone-kuttl-tests/keystone-bootstrap-kc54s" podStartSLOduration=1.685797269 podStartE2EDuration="1.685797269s" podCreationTimestamp="2025-11-28 13:41:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-28 13:41:26.679169662 +0000 UTC m=+1297.532051492" watchObservedRunningTime="2025-11-28 13:41:26.685797269 +0000 UTC m=+1297.538679109" Nov 28 13:41:28 crc kubenswrapper[4970]: I1128 13:41:28.673495 4970 generic.go:334] "Generic (PLEG): container finished" podID="d22c4593-6019-4a0c-9ca3-2da7907946be" containerID="b4a1d256477224d248ffc936d1c5d9aca1aaae06c142ef33adec4c8996aa215b" exitCode=0 Nov 28 13:41:28 crc kubenswrapper[4970]: I1128 13:41:28.673597 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-bootstrap-kc54s" event={"ID":"d22c4593-6019-4a0c-9ca3-2da7907946be","Type":"ContainerDied","Data":"b4a1d256477224d248ffc936d1c5d9aca1aaae06c142ef33adec4c8996aa215b"} Nov 28 13:41:30 crc kubenswrapper[4970]: I1128 13:41:30.005547 4970 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="keystone-kuttl-tests/keystone-bootstrap-kc54s" Nov 28 13:41:30 crc kubenswrapper[4970]: I1128 13:41:30.083524 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d22c4593-6019-4a0c-9ca3-2da7907946be-scripts\") pod \"d22c4593-6019-4a0c-9ca3-2da7907946be\" (UID: \"d22c4593-6019-4a0c-9ca3-2da7907946be\") " Nov 28 13:41:30 crc kubenswrapper[4970]: I1128 13:41:30.083776 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/d22c4593-6019-4a0c-9ca3-2da7907946be-fernet-keys\") pod \"d22c4593-6019-4a0c-9ca3-2da7907946be\" (UID: \"d22c4593-6019-4a0c-9ca3-2da7907946be\") " Nov 28 13:41:30 crc kubenswrapper[4970]: I1128 13:41:30.083822 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d22c4593-6019-4a0c-9ca3-2da7907946be-config-data\") pod \"d22c4593-6019-4a0c-9ca3-2da7907946be\" (UID: \"d22c4593-6019-4a0c-9ca3-2da7907946be\") " Nov 28 13:41:30 crc kubenswrapper[4970]: I1128 13:41:30.083900 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/d22c4593-6019-4a0c-9ca3-2da7907946be-credential-keys\") pod \"d22c4593-6019-4a0c-9ca3-2da7907946be\" (UID: \"d22c4593-6019-4a0c-9ca3-2da7907946be\") " Nov 28 13:41:30 crc kubenswrapper[4970]: I1128 13:41:30.084688 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gmnjb\" (UniqueName: \"kubernetes.io/projected/d22c4593-6019-4a0c-9ca3-2da7907946be-kube-api-access-gmnjb\") pod \"d22c4593-6019-4a0c-9ca3-2da7907946be\" (UID: \"d22c4593-6019-4a0c-9ca3-2da7907946be\") " Nov 28 13:41:30 crc kubenswrapper[4970]: I1128 13:41:30.089944 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d22c4593-6019-4a0c-9ca3-2da7907946be-scripts" (OuterVolumeSpecName: "scripts") pod "d22c4593-6019-4a0c-9ca3-2da7907946be" (UID: "d22c4593-6019-4a0c-9ca3-2da7907946be"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 28 13:41:30 crc kubenswrapper[4970]: I1128 13:41:30.090570 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d22c4593-6019-4a0c-9ca3-2da7907946be-kube-api-access-gmnjb" (OuterVolumeSpecName: "kube-api-access-gmnjb") pod "d22c4593-6019-4a0c-9ca3-2da7907946be" (UID: "d22c4593-6019-4a0c-9ca3-2da7907946be"). InnerVolumeSpecName "kube-api-access-gmnjb". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 28 13:41:30 crc kubenswrapper[4970]: I1128 13:41:30.090804 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d22c4593-6019-4a0c-9ca3-2da7907946be-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "d22c4593-6019-4a0c-9ca3-2da7907946be" (UID: "d22c4593-6019-4a0c-9ca3-2da7907946be"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 28 13:41:30 crc kubenswrapper[4970]: I1128 13:41:30.097329 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d22c4593-6019-4a0c-9ca3-2da7907946be-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "d22c4593-6019-4a0c-9ca3-2da7907946be" (UID: "d22c4593-6019-4a0c-9ca3-2da7907946be"). InnerVolumeSpecName "fernet-keys". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 28 13:41:30 crc kubenswrapper[4970]: I1128 13:41:30.103763 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d22c4593-6019-4a0c-9ca3-2da7907946be-config-data" (OuterVolumeSpecName: "config-data") pod "d22c4593-6019-4a0c-9ca3-2da7907946be" (UID: "d22c4593-6019-4a0c-9ca3-2da7907946be"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 28 13:41:30 crc kubenswrapper[4970]: I1128 13:41:30.186619 4970 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/d22c4593-6019-4a0c-9ca3-2da7907946be-fernet-keys\") on node \"crc\" DevicePath \"\"" Nov 28 13:41:30 crc kubenswrapper[4970]: I1128 13:41:30.186658 4970 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d22c4593-6019-4a0c-9ca3-2da7907946be-config-data\") on node \"crc\" DevicePath \"\"" Nov 28 13:41:30 crc kubenswrapper[4970]: I1128 13:41:30.186669 4970 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/d22c4593-6019-4a0c-9ca3-2da7907946be-credential-keys\") on node \"crc\" DevicePath \"\"" Nov 28 13:41:30 crc kubenswrapper[4970]: I1128 13:41:30.186681 4970 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gmnjb\" (UniqueName: \"kubernetes.io/projected/d22c4593-6019-4a0c-9ca3-2da7907946be-kube-api-access-gmnjb\") on node \"crc\" DevicePath \"\"" Nov 28 13:41:30 crc kubenswrapper[4970]: I1128 13:41:30.186690 4970 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d22c4593-6019-4a0c-9ca3-2da7907946be-scripts\") on node \"crc\" DevicePath \"\"" Nov 28 13:41:30 crc kubenswrapper[4970]: I1128 13:41:30.690691 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-bootstrap-kc54s" event={"ID":"d22c4593-6019-4a0c-9ca3-2da7907946be","Type":"ContainerDied","Data":"eec31d479638396367e654976374fe9dc2f8b7f09b0945a9e6e5b339ffc07210"} Nov 28 13:41:30 crc kubenswrapper[4970]: I1128 13:41:30.690745 4970 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="eec31d479638396367e654976374fe9dc2f8b7f09b0945a9e6e5b339ffc07210" Nov 28 13:41:30 crc kubenswrapper[4970]: I1128 13:41:30.690800 4970 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone-bootstrap-kc54s" Nov 28 13:41:30 crc kubenswrapper[4970]: I1128 13:41:30.786885 4970 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["keystone-kuttl-tests/keystone-7f84979c99-7g2v7"] Nov 28 13:41:30 crc kubenswrapper[4970]: E1128 13:41:30.787210 4970 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d22c4593-6019-4a0c-9ca3-2da7907946be" containerName="keystone-bootstrap" Nov 28 13:41:30 crc kubenswrapper[4970]: I1128 13:41:30.787256 4970 state_mem.go:107] "Deleted CPUSet assignment" podUID="d22c4593-6019-4a0c-9ca3-2da7907946be" containerName="keystone-bootstrap" Nov 28 13:41:30 crc kubenswrapper[4970]: I1128 13:41:30.787444 4970 memory_manager.go:354] "RemoveStaleState removing state" podUID="d22c4593-6019-4a0c-9ca3-2da7907946be" containerName="keystone-bootstrap" Nov 28 13:41:30 crc kubenswrapper[4970]: I1128 13:41:30.788042 4970 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="keystone-kuttl-tests/keystone-7f84979c99-7g2v7" Nov 28 13:41:30 crc kubenswrapper[4970]: I1128 13:41:30.789900 4970 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"keystone-scripts" Nov 28 13:41:30 crc kubenswrapper[4970]: I1128 13:41:30.790416 4970 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"keystone-keystone-dockercfg-qxxq5" Nov 28 13:41:30 crc kubenswrapper[4970]: I1128 13:41:30.791746 4970 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"keystone-config-data" Nov 28 13:41:30 crc kubenswrapper[4970]: I1128 13:41:30.792192 4970 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"keystone" Nov 28 13:41:30 crc kubenswrapper[4970]: I1128 13:41:30.801416 4970 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/keystone-7f84979c99-7g2v7"] Nov 28 13:41:30 crc kubenswrapper[4970]: I1128 13:41:30.896674 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9232407d-077b-4ed6-9350-ee386d73677d-config-data\") pod \"keystone-7f84979c99-7g2v7\" (UID: \"9232407d-077b-4ed6-9350-ee386d73677d\") " pod="keystone-kuttl-tests/keystone-7f84979c99-7g2v7" Nov 28 13:41:30 crc kubenswrapper[4970]: I1128 13:41:30.896754 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/9232407d-077b-4ed6-9350-ee386d73677d-credential-keys\") pod \"keystone-7f84979c99-7g2v7\" (UID: \"9232407d-077b-4ed6-9350-ee386d73677d\") " pod="keystone-kuttl-tests/keystone-7f84979c99-7g2v7" Nov 28 13:41:30 crc kubenswrapper[4970]: I1128 13:41:30.896886 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6lcvp\" (UniqueName: \"kubernetes.io/projected/9232407d-077b-4ed6-9350-ee386d73677d-kube-api-access-6lcvp\") pod \"keystone-7f84979c99-7g2v7\" (UID: \"9232407d-077b-4ed6-9350-ee386d73677d\") " pod="keystone-kuttl-tests/keystone-7f84979c99-7g2v7" Nov 28 13:41:30 crc kubenswrapper[4970]: I1128 13:41:30.896946 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/9232407d-077b-4ed6-9350-ee386d73677d-fernet-keys\") pod \"keystone-7f84979c99-7g2v7\" (UID: \"9232407d-077b-4ed6-9350-ee386d73677d\") " pod="keystone-kuttl-tests/keystone-7f84979c99-7g2v7" Nov 28 13:41:30 crc kubenswrapper[4970]: I1128 13:41:30.897135 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9232407d-077b-4ed6-9350-ee386d73677d-scripts\") pod \"keystone-7f84979c99-7g2v7\" (UID: \"9232407d-077b-4ed6-9350-ee386d73677d\") " pod="keystone-kuttl-tests/keystone-7f84979c99-7g2v7" Nov 28 13:41:30 crc kubenswrapper[4970]: I1128 13:41:30.998133 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9232407d-077b-4ed6-9350-ee386d73677d-config-data\") pod \"keystone-7f84979c99-7g2v7\" (UID: \"9232407d-077b-4ed6-9350-ee386d73677d\") " pod="keystone-kuttl-tests/keystone-7f84979c99-7g2v7" Nov 28 13:41:30 crc kubenswrapper[4970]: I1128 13:41:30.998204 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" 
(UniqueName: \"kubernetes.io/secret/9232407d-077b-4ed6-9350-ee386d73677d-credential-keys\") pod \"keystone-7f84979c99-7g2v7\" (UID: \"9232407d-077b-4ed6-9350-ee386d73677d\") " pod="keystone-kuttl-tests/keystone-7f84979c99-7g2v7" Nov 28 13:41:30 crc kubenswrapper[4970]: I1128 13:41:30.998274 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6lcvp\" (UniqueName: \"kubernetes.io/projected/9232407d-077b-4ed6-9350-ee386d73677d-kube-api-access-6lcvp\") pod \"keystone-7f84979c99-7g2v7\" (UID: \"9232407d-077b-4ed6-9350-ee386d73677d\") " pod="keystone-kuttl-tests/keystone-7f84979c99-7g2v7" Nov 28 13:41:30 crc kubenswrapper[4970]: I1128 13:41:30.998323 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/9232407d-077b-4ed6-9350-ee386d73677d-fernet-keys\") pod \"keystone-7f84979c99-7g2v7\" (UID: \"9232407d-077b-4ed6-9350-ee386d73677d\") " pod="keystone-kuttl-tests/keystone-7f84979c99-7g2v7" Nov 28 13:41:30 crc kubenswrapper[4970]: I1128 13:41:30.998417 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9232407d-077b-4ed6-9350-ee386d73677d-scripts\") pod \"keystone-7f84979c99-7g2v7\" (UID: \"9232407d-077b-4ed6-9350-ee386d73677d\") " pod="keystone-kuttl-tests/keystone-7f84979c99-7g2v7" Nov 28 13:41:31 crc kubenswrapper[4970]: I1128 13:41:31.002486 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/9232407d-077b-4ed6-9350-ee386d73677d-fernet-keys\") pod \"keystone-7f84979c99-7g2v7\" (UID: \"9232407d-077b-4ed6-9350-ee386d73677d\") " pod="keystone-kuttl-tests/keystone-7f84979c99-7g2v7" Nov 28 13:41:31 crc kubenswrapper[4970]: I1128 13:41:31.002533 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/9232407d-077b-4ed6-9350-ee386d73677d-credential-keys\") pod \"keystone-7f84979c99-7g2v7\" (UID: \"9232407d-077b-4ed6-9350-ee386d73677d\") " pod="keystone-kuttl-tests/keystone-7f84979c99-7g2v7" Nov 28 13:41:31 crc kubenswrapper[4970]: I1128 13:41:31.003059 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9232407d-077b-4ed6-9350-ee386d73677d-scripts\") pod \"keystone-7f84979c99-7g2v7\" (UID: \"9232407d-077b-4ed6-9350-ee386d73677d\") " pod="keystone-kuttl-tests/keystone-7f84979c99-7g2v7" Nov 28 13:41:31 crc kubenswrapper[4970]: I1128 13:41:31.003902 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9232407d-077b-4ed6-9350-ee386d73677d-config-data\") pod \"keystone-7f84979c99-7g2v7\" (UID: \"9232407d-077b-4ed6-9350-ee386d73677d\") " pod="keystone-kuttl-tests/keystone-7f84979c99-7g2v7" Nov 28 13:41:31 crc kubenswrapper[4970]: I1128 13:41:31.025816 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6lcvp\" (UniqueName: \"kubernetes.io/projected/9232407d-077b-4ed6-9350-ee386d73677d-kube-api-access-6lcvp\") pod \"keystone-7f84979c99-7g2v7\" (UID: \"9232407d-077b-4ed6-9350-ee386d73677d\") " pod="keystone-kuttl-tests/keystone-7f84979c99-7g2v7" Nov 28 13:41:31 crc kubenswrapper[4970]: I1128 13:41:31.125282 4970 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="keystone-kuttl-tests/keystone-7f84979c99-7g2v7" Nov 28 13:41:31 crc kubenswrapper[4970]: I1128 13:41:31.359688 4970 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/keystone-7f84979c99-7g2v7"] Nov 28 13:41:31 crc kubenswrapper[4970]: W1128 13:41:31.368840 4970 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9232407d_077b_4ed6_9350_ee386d73677d.slice/crio-06b2d2fafa3a2f6b6af8874eafc901c3d588e1bb3ea4adf8c68026dbc8fdf693 WatchSource:0}: Error finding container 06b2d2fafa3a2f6b6af8874eafc901c3d588e1bb3ea4adf8c68026dbc8fdf693: Status 404 returned error can't find the container with id 06b2d2fafa3a2f6b6af8874eafc901c3d588e1bb3ea4adf8c68026dbc8fdf693 Nov 28 13:41:31 crc kubenswrapper[4970]: I1128 13:41:31.700657 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-7f84979c99-7g2v7" event={"ID":"9232407d-077b-4ed6-9350-ee386d73677d","Type":"ContainerStarted","Data":"bd25509b7b4977c0e5153720c068be9c128e3c06683781a962c067b244b1143d"} Nov 28 13:41:31 crc kubenswrapper[4970]: I1128 13:41:31.700999 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-7f84979c99-7g2v7" event={"ID":"9232407d-077b-4ed6-9350-ee386d73677d","Type":"ContainerStarted","Data":"06b2d2fafa3a2f6b6af8874eafc901c3d588e1bb3ea4adf8c68026dbc8fdf693"} Nov 28 13:41:31 crc kubenswrapper[4970]: I1128 13:41:31.701019 4970 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="keystone-kuttl-tests/keystone-7f84979c99-7g2v7" Nov 28 13:41:31 crc kubenswrapper[4970]: I1128 13:41:31.728163 4970 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="keystone-kuttl-tests/keystone-7f84979c99-7g2v7" podStartSLOduration=1.728139375 podStartE2EDuration="1.728139375s" podCreationTimestamp="2025-11-28 13:41:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-28 13:41:31.720278564 +0000 UTC m=+1302.573160374" watchObservedRunningTime="2025-11-28 13:41:31.728139375 +0000 UTC m=+1302.581021215" Nov 28 13:41:51 crc kubenswrapper[4970]: I1128 13:41:51.333573 4970 patch_prober.go:28] interesting pod/machine-config-daemon-tjrng container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 28 13:41:51 crc kubenswrapper[4970]: I1128 13:41:51.334396 4970 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-tjrng" podUID="70bedd43-c527-436e-b47b-0b9ec5b10601" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 28 13:41:51 crc kubenswrapper[4970]: I1128 13:41:51.334467 4970 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-tjrng" Nov 28 13:41:51 crc kubenswrapper[4970]: I1128 13:41:51.335503 4970 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"3b2419e8c0d194a5a29f4c079224199be6f97c77dca32e7413992a8fbfc0b4d2"} pod="openshift-machine-config-operator/machine-config-daemon-tjrng" containerMessage="Container machine-config-daemon failed 
liveness probe, will be restarted" Nov 28 13:41:51 crc kubenswrapper[4970]: I1128 13:41:51.335587 4970 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-tjrng" podUID="70bedd43-c527-436e-b47b-0b9ec5b10601" containerName="machine-config-daemon" containerID="cri-o://3b2419e8c0d194a5a29f4c079224199be6f97c77dca32e7413992a8fbfc0b4d2" gracePeriod=600 Nov 28 13:41:51 crc kubenswrapper[4970]: I1128 13:41:51.914373 4970 generic.go:334] "Generic (PLEG): container finished" podID="70bedd43-c527-436e-b47b-0b9ec5b10601" containerID="3b2419e8c0d194a5a29f4c079224199be6f97c77dca32e7413992a8fbfc0b4d2" exitCode=0 Nov 28 13:41:51 crc kubenswrapper[4970]: I1128 13:41:51.914433 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-tjrng" event={"ID":"70bedd43-c527-436e-b47b-0b9ec5b10601","Type":"ContainerDied","Data":"3b2419e8c0d194a5a29f4c079224199be6f97c77dca32e7413992a8fbfc0b4d2"} Nov 28 13:41:51 crc kubenswrapper[4970]: I1128 13:41:51.914769 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-tjrng" event={"ID":"70bedd43-c527-436e-b47b-0b9ec5b10601","Type":"ContainerStarted","Data":"8085d03de453d8ba9c452eec6c018c90247e638e269f72dd81e94e709f69fd50"} Nov 28 13:41:51 crc kubenswrapper[4970]: I1128 13:41:51.914805 4970 scope.go:117] "RemoveContainer" containerID="cd30743a39e211613c4a030816a122a40e3a3cc19bf445b31c5fe37b451ef30e" Nov 28 13:42:02 crc kubenswrapper[4970]: I1128 13:42:02.506777 4970 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="keystone-kuttl-tests/keystone-7f84979c99-7g2v7" Nov 28 13:42:03 crc kubenswrapper[4970]: I1128 13:42:03.852519 4970 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["keystone-kuttl-tests/openstackclient"] Nov 28 13:42:03 crc kubenswrapper[4970]: I1128 13:42:03.854722 4970 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="keystone-kuttl-tests/openstackclient" Nov 28 13:42:03 crc kubenswrapper[4970]: I1128 13:42:03.857042 4970 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"default-dockercfg-txnh2" Nov 28 13:42:03 crc kubenswrapper[4970]: I1128 13:42:03.858597 4970 reflector.go:368] Caches populated for *v1.ConfigMap from object-"keystone-kuttl-tests"/"openstack-config" Nov 28 13:42:03 crc kubenswrapper[4970]: I1128 13:42:03.861353 4970 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"openstack-config-secret" Nov 28 13:42:03 crc kubenswrapper[4970]: I1128 13:42:03.863309 4970 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/openstackclient"] Nov 28 13:42:03 crc kubenswrapper[4970]: I1128 13:42:03.968490 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/4712809e-99af-47ba-84d0-085f3b07f326-openstack-config\") pod \"openstackclient\" (UID: \"4712809e-99af-47ba-84d0-085f3b07f326\") " pod="keystone-kuttl-tests/openstackclient" Nov 28 13:42:03 crc kubenswrapper[4970]: I1128 13:42:03.968706 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c6htt\" (UniqueName: \"kubernetes.io/projected/4712809e-99af-47ba-84d0-085f3b07f326-kube-api-access-c6htt\") pod \"openstackclient\" (UID: \"4712809e-99af-47ba-84d0-085f3b07f326\") " pod="keystone-kuttl-tests/openstackclient" Nov 28 13:42:03 crc kubenswrapper[4970]: I1128 13:42:03.968773 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/4712809e-99af-47ba-84d0-085f3b07f326-openstack-config-secret\") pod \"openstackclient\" (UID: \"4712809e-99af-47ba-84d0-085f3b07f326\") " pod="keystone-kuttl-tests/openstackclient" Nov 28 13:42:04 crc kubenswrapper[4970]: I1128 13:42:04.070869 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c6htt\" (UniqueName: \"kubernetes.io/projected/4712809e-99af-47ba-84d0-085f3b07f326-kube-api-access-c6htt\") pod \"openstackclient\" (UID: \"4712809e-99af-47ba-84d0-085f3b07f326\") " pod="keystone-kuttl-tests/openstackclient" Nov 28 13:42:04 crc kubenswrapper[4970]: I1128 13:42:04.070990 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/4712809e-99af-47ba-84d0-085f3b07f326-openstack-config-secret\") pod \"openstackclient\" (UID: \"4712809e-99af-47ba-84d0-085f3b07f326\") " pod="keystone-kuttl-tests/openstackclient" Nov 28 13:42:04 crc kubenswrapper[4970]: I1128 13:42:04.071083 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/4712809e-99af-47ba-84d0-085f3b07f326-openstack-config\") pod \"openstackclient\" (UID: \"4712809e-99af-47ba-84d0-085f3b07f326\") " pod="keystone-kuttl-tests/openstackclient" Nov 28 13:42:04 crc kubenswrapper[4970]: I1128 13:42:04.073592 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/4712809e-99af-47ba-84d0-085f3b07f326-openstack-config\") pod \"openstackclient\" (UID: \"4712809e-99af-47ba-84d0-085f3b07f326\") " pod="keystone-kuttl-tests/openstackclient" Nov 28 13:42:04 crc kubenswrapper[4970]: I1128 
13:42:04.082501 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/4712809e-99af-47ba-84d0-085f3b07f326-openstack-config-secret\") pod \"openstackclient\" (UID: \"4712809e-99af-47ba-84d0-085f3b07f326\") " pod="keystone-kuttl-tests/openstackclient" Nov 28 13:42:04 crc kubenswrapper[4970]: I1128 13:42:04.101463 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c6htt\" (UniqueName: \"kubernetes.io/projected/4712809e-99af-47ba-84d0-085f3b07f326-kube-api-access-c6htt\") pod \"openstackclient\" (UID: \"4712809e-99af-47ba-84d0-085f3b07f326\") " pod="keystone-kuttl-tests/openstackclient" Nov 28 13:42:04 crc kubenswrapper[4970]: I1128 13:42:04.190908 4970 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/openstackclient" Nov 28 13:42:04 crc kubenswrapper[4970]: I1128 13:42:04.486926 4970 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/openstackclient"] Nov 28 13:42:04 crc kubenswrapper[4970]: I1128 13:42:04.502866 4970 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Nov 28 13:42:05 crc kubenswrapper[4970]: I1128 13:42:05.045572 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/openstackclient" event={"ID":"4712809e-99af-47ba-84d0-085f3b07f326","Type":"ContainerStarted","Data":"3067979218a3b1a4aa085d5831a366af93218516ffa586b7c8339599a1b99c19"} Nov 28 13:42:13 crc kubenswrapper[4970]: I1128 13:42:13.109411 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/openstackclient" event={"ID":"4712809e-99af-47ba-84d0-085f3b07f326","Type":"ContainerStarted","Data":"7096b7c56697c221da2868892e97f77650355123a3a886f93cfdc446b2c286b3"} Nov 28 13:42:13 crc kubenswrapper[4970]: I1128 13:42:13.139287 4970 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="keystone-kuttl-tests/openstackclient" podStartSLOduration=2.222822704 podStartE2EDuration="10.139262124s" podCreationTimestamp="2025-11-28 13:42:03 +0000 UTC" firstStartedPulling="2025-11-28 13:42:04.502533092 +0000 UTC m=+1335.355414892" lastFinishedPulling="2025-11-28 13:42:12.418972522 +0000 UTC m=+1343.271854312" observedRunningTime="2025-11-28 13:42:13.132145623 +0000 UTC m=+1343.985027453" watchObservedRunningTime="2025-11-28 13:42:13.139262124 +0000 UTC m=+1343.992143964" Nov 28 13:42:50 crc kubenswrapper[4970]: I1128 13:42:50.360176 4970 scope.go:117] "RemoveContainer" containerID="b98972926a03a7c3de9c1bcabca73c5fd08a14299da372458a680f241a8dfeb1" Nov 28 13:42:50 crc kubenswrapper[4970]: I1128 13:42:50.423125 4970 scope.go:117] "RemoveContainer" containerID="442ae51f9a087ea62429a9eb102060380b7990952f01041b58a082dcba885912" Nov 28 13:42:50 crc kubenswrapper[4970]: I1128 13:42:50.457890 4970 scope.go:117] "RemoveContainer" containerID="6f8435a264f23b7d579e74f5f19a57cba7471b2418c2a6478d30533bbd6f0cb3" Nov 28 13:42:50 crc kubenswrapper[4970]: I1128 13:42:50.518342 4970 scope.go:117] "RemoveContainer" containerID="daf7dd6f5a9566952e5e4beb78265791a07df0c0a0f1cd3b9a019aed65d7c030" Nov 28 13:43:07 crc kubenswrapper[4970]: I1128 13:43:07.481303 4970 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-j964m"] Nov 28 13:43:07 crc kubenswrapper[4970]: I1128 13:43:07.484750 4970 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-j964m" Nov 28 13:43:07 crc kubenswrapper[4970]: I1128 13:43:07.501394 4970 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-j964m"] Nov 28 13:43:07 crc kubenswrapper[4970]: I1128 13:43:07.597938 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wdl9b\" (UniqueName: \"kubernetes.io/projected/6d3e74e9-4f48-4a11-b739-6e3557e81ba5-kube-api-access-wdl9b\") pod \"redhat-operators-j964m\" (UID: \"6d3e74e9-4f48-4a11-b739-6e3557e81ba5\") " pod="openshift-marketplace/redhat-operators-j964m" Nov 28 13:43:07 crc kubenswrapper[4970]: I1128 13:43:07.598102 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6d3e74e9-4f48-4a11-b739-6e3557e81ba5-catalog-content\") pod \"redhat-operators-j964m\" (UID: \"6d3e74e9-4f48-4a11-b739-6e3557e81ba5\") " pod="openshift-marketplace/redhat-operators-j964m" Nov 28 13:43:07 crc kubenswrapper[4970]: I1128 13:43:07.598158 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6d3e74e9-4f48-4a11-b739-6e3557e81ba5-utilities\") pod \"redhat-operators-j964m\" (UID: \"6d3e74e9-4f48-4a11-b739-6e3557e81ba5\") " pod="openshift-marketplace/redhat-operators-j964m" Nov 28 13:43:07 crc kubenswrapper[4970]: I1128 13:43:07.699692 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wdl9b\" (UniqueName: \"kubernetes.io/projected/6d3e74e9-4f48-4a11-b739-6e3557e81ba5-kube-api-access-wdl9b\") pod \"redhat-operators-j964m\" (UID: \"6d3e74e9-4f48-4a11-b739-6e3557e81ba5\") " pod="openshift-marketplace/redhat-operators-j964m" Nov 28 13:43:07 crc kubenswrapper[4970]: I1128 13:43:07.699802 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6d3e74e9-4f48-4a11-b739-6e3557e81ba5-catalog-content\") pod \"redhat-operators-j964m\" (UID: \"6d3e74e9-4f48-4a11-b739-6e3557e81ba5\") " pod="openshift-marketplace/redhat-operators-j964m" Nov 28 13:43:07 crc kubenswrapper[4970]: I1128 13:43:07.699841 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6d3e74e9-4f48-4a11-b739-6e3557e81ba5-utilities\") pod \"redhat-operators-j964m\" (UID: \"6d3e74e9-4f48-4a11-b739-6e3557e81ba5\") " pod="openshift-marketplace/redhat-operators-j964m" Nov 28 13:43:07 crc kubenswrapper[4970]: I1128 13:43:07.700503 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6d3e74e9-4f48-4a11-b739-6e3557e81ba5-utilities\") pod \"redhat-operators-j964m\" (UID: \"6d3e74e9-4f48-4a11-b739-6e3557e81ba5\") " pod="openshift-marketplace/redhat-operators-j964m" Nov 28 13:43:07 crc kubenswrapper[4970]: I1128 13:43:07.700661 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6d3e74e9-4f48-4a11-b739-6e3557e81ba5-catalog-content\") pod \"redhat-operators-j964m\" (UID: \"6d3e74e9-4f48-4a11-b739-6e3557e81ba5\") " pod="openshift-marketplace/redhat-operators-j964m" Nov 28 13:43:07 crc kubenswrapper[4970]: I1128 13:43:07.742894 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-wdl9b\" (UniqueName: \"kubernetes.io/projected/6d3e74e9-4f48-4a11-b739-6e3557e81ba5-kube-api-access-wdl9b\") pod \"redhat-operators-j964m\" (UID: \"6d3e74e9-4f48-4a11-b739-6e3557e81ba5\") " pod="openshift-marketplace/redhat-operators-j964m" Nov 28 13:43:07 crc kubenswrapper[4970]: I1128 13:43:07.816042 4970 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-j964m" Nov 28 13:43:08 crc kubenswrapper[4970]: I1128 13:43:08.298447 4970 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-j964m"] Nov 28 13:43:08 crc kubenswrapper[4970]: I1128 13:43:08.625122 4970 generic.go:334] "Generic (PLEG): container finished" podID="6d3e74e9-4f48-4a11-b739-6e3557e81ba5" containerID="3efe0570aee29c6ed8bdc1a52ca94828920b789849f04be883a3d7f849e0e758" exitCode=0 Nov 28 13:43:08 crc kubenswrapper[4970]: I1128 13:43:08.625184 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-j964m" event={"ID":"6d3e74e9-4f48-4a11-b739-6e3557e81ba5","Type":"ContainerDied","Data":"3efe0570aee29c6ed8bdc1a52ca94828920b789849f04be883a3d7f849e0e758"} Nov 28 13:43:08 crc kubenswrapper[4970]: I1128 13:43:08.625503 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-j964m" event={"ID":"6d3e74e9-4f48-4a11-b739-6e3557e81ba5","Type":"ContainerStarted","Data":"41dcc35fb9ff92753ed537a5d327e63f993b27c802ef7a7b9da500e75b1103ea"} Nov 28 13:43:09 crc kubenswrapper[4970]: I1128 13:43:09.680593 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-j964m" event={"ID":"6d3e74e9-4f48-4a11-b739-6e3557e81ba5","Type":"ContainerStarted","Data":"aefd424193bfa244aabaf95680fd69ebe5bf079ddd3c49278256f2b25fcf119f"} Nov 28 13:43:10 crc kubenswrapper[4970]: I1128 13:43:10.653319 4970 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-wj8h4"] Nov 28 13:43:10 crc kubenswrapper[4970]: I1128 13:43:10.656389 4970 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-wj8h4" Nov 28 13:43:10 crc kubenswrapper[4970]: I1128 13:43:10.677720 4970 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-wj8h4"] Nov 28 13:43:10 crc kubenswrapper[4970]: I1128 13:43:10.695657 4970 generic.go:334] "Generic (PLEG): container finished" podID="6d3e74e9-4f48-4a11-b739-6e3557e81ba5" containerID="aefd424193bfa244aabaf95680fd69ebe5bf079ddd3c49278256f2b25fcf119f" exitCode=0 Nov 28 13:43:10 crc kubenswrapper[4970]: I1128 13:43:10.695739 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-j964m" event={"ID":"6d3e74e9-4f48-4a11-b739-6e3557e81ba5","Type":"ContainerDied","Data":"aefd424193bfa244aabaf95680fd69ebe5bf079ddd3c49278256f2b25fcf119f"} Nov 28 13:43:10 crc kubenswrapper[4970]: I1128 13:43:10.842348 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7xrxq\" (UniqueName: \"kubernetes.io/projected/74c05369-8de7-4fde-a2e1-fd8a7484d151-kube-api-access-7xrxq\") pod \"community-operators-wj8h4\" (UID: \"74c05369-8de7-4fde-a2e1-fd8a7484d151\") " pod="openshift-marketplace/community-operators-wj8h4" Nov 28 13:43:10 crc kubenswrapper[4970]: I1128 13:43:10.842745 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/74c05369-8de7-4fde-a2e1-fd8a7484d151-catalog-content\") pod \"community-operators-wj8h4\" (UID: \"74c05369-8de7-4fde-a2e1-fd8a7484d151\") " pod="openshift-marketplace/community-operators-wj8h4" Nov 28 13:43:10 crc kubenswrapper[4970]: I1128 13:43:10.842830 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/74c05369-8de7-4fde-a2e1-fd8a7484d151-utilities\") pod \"community-operators-wj8h4\" (UID: \"74c05369-8de7-4fde-a2e1-fd8a7484d151\") " pod="openshift-marketplace/community-operators-wj8h4" Nov 28 13:43:10 crc kubenswrapper[4970]: I1128 13:43:10.948187 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7xrxq\" (UniqueName: \"kubernetes.io/projected/74c05369-8de7-4fde-a2e1-fd8a7484d151-kube-api-access-7xrxq\") pod \"community-operators-wj8h4\" (UID: \"74c05369-8de7-4fde-a2e1-fd8a7484d151\") " pod="openshift-marketplace/community-operators-wj8h4" Nov 28 13:43:10 crc kubenswrapper[4970]: I1128 13:43:10.948983 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/74c05369-8de7-4fde-a2e1-fd8a7484d151-catalog-content\") pod \"community-operators-wj8h4\" (UID: \"74c05369-8de7-4fde-a2e1-fd8a7484d151\") " pod="openshift-marketplace/community-operators-wj8h4" Nov 28 13:43:10 crc kubenswrapper[4970]: I1128 13:43:10.949145 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/74c05369-8de7-4fde-a2e1-fd8a7484d151-utilities\") pod \"community-operators-wj8h4\" (UID: \"74c05369-8de7-4fde-a2e1-fd8a7484d151\") " pod="openshift-marketplace/community-operators-wj8h4" Nov 28 13:43:10 crc kubenswrapper[4970]: I1128 13:43:10.949504 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/74c05369-8de7-4fde-a2e1-fd8a7484d151-catalog-content\") pod \"community-operators-wj8h4\" 
(UID: \"74c05369-8de7-4fde-a2e1-fd8a7484d151\") " pod="openshift-marketplace/community-operators-wj8h4" Nov 28 13:43:10 crc kubenswrapper[4970]: I1128 13:43:10.949580 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/74c05369-8de7-4fde-a2e1-fd8a7484d151-utilities\") pod \"community-operators-wj8h4\" (UID: \"74c05369-8de7-4fde-a2e1-fd8a7484d151\") " pod="openshift-marketplace/community-operators-wj8h4" Nov 28 13:43:10 crc kubenswrapper[4970]: I1128 13:43:10.984488 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7xrxq\" (UniqueName: \"kubernetes.io/projected/74c05369-8de7-4fde-a2e1-fd8a7484d151-kube-api-access-7xrxq\") pod \"community-operators-wj8h4\" (UID: \"74c05369-8de7-4fde-a2e1-fd8a7484d151\") " pod="openshift-marketplace/community-operators-wj8h4" Nov 28 13:43:10 crc kubenswrapper[4970]: I1128 13:43:10.991195 4970 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-wj8h4" Nov 28 13:43:11 crc kubenswrapper[4970]: I1128 13:43:11.489140 4970 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-wj8h4"] Nov 28 13:43:11 crc kubenswrapper[4970]: W1128 13:43:11.491757 4970 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod74c05369_8de7_4fde_a2e1_fd8a7484d151.slice/crio-05563f61ac2b8b985f4a2bde813c5bb601b5d1dfea7a22fb1996f3f35734af79 WatchSource:0}: Error finding container 05563f61ac2b8b985f4a2bde813c5bb601b5d1dfea7a22fb1996f3f35734af79: Status 404 returned error can't find the container with id 05563f61ac2b8b985f4a2bde813c5bb601b5d1dfea7a22fb1996f3f35734af79 Nov 28 13:43:11 crc kubenswrapper[4970]: I1128 13:43:11.707879 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-j964m" event={"ID":"6d3e74e9-4f48-4a11-b739-6e3557e81ba5","Type":"ContainerStarted","Data":"c5d043e215d8190fe58aee86c46d6525196670bf61239c64aee0cfe16f2ae0eb"} Nov 28 13:43:11 crc kubenswrapper[4970]: I1128 13:43:11.714411 4970 generic.go:334] "Generic (PLEG): container finished" podID="74c05369-8de7-4fde-a2e1-fd8a7484d151" containerID="4304f24a356d127074fad7c7327136a59a5a2663d473b2d385705ab4e35659a5" exitCode=0 Nov 28 13:43:11 crc kubenswrapper[4970]: I1128 13:43:11.714466 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wj8h4" event={"ID":"74c05369-8de7-4fde-a2e1-fd8a7484d151","Type":"ContainerDied","Data":"4304f24a356d127074fad7c7327136a59a5a2663d473b2d385705ab4e35659a5"} Nov 28 13:43:11 crc kubenswrapper[4970]: I1128 13:43:11.714495 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wj8h4" event={"ID":"74c05369-8de7-4fde-a2e1-fd8a7484d151","Type":"ContainerStarted","Data":"05563f61ac2b8b985f4a2bde813c5bb601b5d1dfea7a22fb1996f3f35734af79"} Nov 28 13:43:11 crc kubenswrapper[4970]: I1128 13:43:11.731603 4970 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-j964m" podStartSLOduration=2.047543869 podStartE2EDuration="4.731583393s" podCreationTimestamp="2025-11-28 13:43:07 +0000 UTC" firstStartedPulling="2025-11-28 13:43:08.626679416 +0000 UTC m=+1399.479561226" lastFinishedPulling="2025-11-28 13:43:11.31071895 +0000 UTC m=+1402.163600750" observedRunningTime="2025-11-28 13:43:11.72610989 +0000 UTC m=+1402.578991720" 
watchObservedRunningTime="2025-11-28 13:43:11.731583393 +0000 UTC m=+1402.584465193" Nov 28 13:43:12 crc kubenswrapper[4970]: I1128 13:43:12.727284 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wj8h4" event={"ID":"74c05369-8de7-4fde-a2e1-fd8a7484d151","Type":"ContainerStarted","Data":"7311a6b4e2260c3164fb26e95da8c8a9cdcacd22cde8e625c5b8e66dcf39226a"} Nov 28 13:43:13 crc kubenswrapper[4970]: I1128 13:43:13.737920 4970 generic.go:334] "Generic (PLEG): container finished" podID="74c05369-8de7-4fde-a2e1-fd8a7484d151" containerID="7311a6b4e2260c3164fb26e95da8c8a9cdcacd22cde8e625c5b8e66dcf39226a" exitCode=0 Nov 28 13:43:13 crc kubenswrapper[4970]: I1128 13:43:13.737973 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wj8h4" event={"ID":"74c05369-8de7-4fde-a2e1-fd8a7484d151","Type":"ContainerDied","Data":"7311a6b4e2260c3164fb26e95da8c8a9cdcacd22cde8e625c5b8e66dcf39226a"} Nov 28 13:43:14 crc kubenswrapper[4970]: I1128 13:43:14.747192 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wj8h4" event={"ID":"74c05369-8de7-4fde-a2e1-fd8a7484d151","Type":"ContainerStarted","Data":"34cdf53d07cf1c386720072ce112f0518ae9fe32d43e37a44c3d554d1aa1ada6"} Nov 28 13:43:14 crc kubenswrapper[4970]: I1128 13:43:14.775282 4970 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-wj8h4" podStartSLOduration=2.306763793 podStartE2EDuration="4.775263658s" podCreationTimestamp="2025-11-28 13:43:10 +0000 UTC" firstStartedPulling="2025-11-28 13:43:11.715872263 +0000 UTC m=+1402.568754063" lastFinishedPulling="2025-11-28 13:43:14.184372118 +0000 UTC m=+1405.037253928" observedRunningTime="2025-11-28 13:43:14.771878493 +0000 UTC m=+1405.624760333" watchObservedRunningTime="2025-11-28 13:43:14.775263658 +0000 UTC m=+1405.628145468" Nov 28 13:43:17 crc kubenswrapper[4970]: I1128 13:43:17.816338 4970 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-j964m" Nov 28 13:43:17 crc kubenswrapper[4970]: I1128 13:43:17.817896 4970 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-j964m" Nov 28 13:43:18 crc kubenswrapper[4970]: I1128 13:43:18.870822 4970 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-j964m" podUID="6d3e74e9-4f48-4a11-b739-6e3557e81ba5" containerName="registry-server" probeResult="failure" output=< Nov 28 13:43:18 crc kubenswrapper[4970]: timeout: failed to connect service ":50051" within 1s Nov 28 13:43:18 crc kubenswrapper[4970]: > Nov 28 13:43:20 crc kubenswrapper[4970]: I1128 13:43:20.992869 4970 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-wj8h4" Nov 28 13:43:20 crc kubenswrapper[4970]: I1128 13:43:20.993257 4970 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-wj8h4" Nov 28 13:43:21 crc kubenswrapper[4970]: I1128 13:43:21.067586 4970 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-wj8h4" Nov 28 13:43:21 crc kubenswrapper[4970]: I1128 13:43:21.846924 4970 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-wj8h4" Nov 28 13:43:21 crc kubenswrapper[4970]: I1128 
13:43:21.893826 4970 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-wj8h4"] Nov 28 13:43:23 crc kubenswrapper[4970]: I1128 13:43:23.819297 4970 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-wj8h4" podUID="74c05369-8de7-4fde-a2e1-fd8a7484d151" containerName="registry-server" containerID="cri-o://34cdf53d07cf1c386720072ce112f0518ae9fe32d43e37a44c3d554d1aa1ada6" gracePeriod=2 Nov 28 13:43:24 crc kubenswrapper[4970]: I1128 13:43:24.832050 4970 generic.go:334] "Generic (PLEG): container finished" podID="74c05369-8de7-4fde-a2e1-fd8a7484d151" containerID="34cdf53d07cf1c386720072ce112f0518ae9fe32d43e37a44c3d554d1aa1ada6" exitCode=0 Nov 28 13:43:24 crc kubenswrapper[4970]: I1128 13:43:24.832082 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wj8h4" event={"ID":"74c05369-8de7-4fde-a2e1-fd8a7484d151","Type":"ContainerDied","Data":"34cdf53d07cf1c386720072ce112f0518ae9fe32d43e37a44c3d554d1aa1ada6"} Nov 28 13:43:25 crc kubenswrapper[4970]: I1128 13:43:25.449607 4970 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-wj8h4" Nov 28 13:43:25 crc kubenswrapper[4970]: I1128 13:43:25.505460 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/74c05369-8de7-4fde-a2e1-fd8a7484d151-utilities\") pod \"74c05369-8de7-4fde-a2e1-fd8a7484d151\" (UID: \"74c05369-8de7-4fde-a2e1-fd8a7484d151\") " Nov 28 13:43:25 crc kubenswrapper[4970]: I1128 13:43:25.505567 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/74c05369-8de7-4fde-a2e1-fd8a7484d151-catalog-content\") pod \"74c05369-8de7-4fde-a2e1-fd8a7484d151\" (UID: \"74c05369-8de7-4fde-a2e1-fd8a7484d151\") " Nov 28 13:43:25 crc kubenswrapper[4970]: I1128 13:43:25.505645 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7xrxq\" (UniqueName: \"kubernetes.io/projected/74c05369-8de7-4fde-a2e1-fd8a7484d151-kube-api-access-7xrxq\") pod \"74c05369-8de7-4fde-a2e1-fd8a7484d151\" (UID: \"74c05369-8de7-4fde-a2e1-fd8a7484d151\") " Nov 28 13:43:25 crc kubenswrapper[4970]: I1128 13:43:25.507292 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/74c05369-8de7-4fde-a2e1-fd8a7484d151-utilities" (OuterVolumeSpecName: "utilities") pod "74c05369-8de7-4fde-a2e1-fd8a7484d151" (UID: "74c05369-8de7-4fde-a2e1-fd8a7484d151"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 28 13:43:25 crc kubenswrapper[4970]: I1128 13:43:25.515424 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/74c05369-8de7-4fde-a2e1-fd8a7484d151-kube-api-access-7xrxq" (OuterVolumeSpecName: "kube-api-access-7xrxq") pod "74c05369-8de7-4fde-a2e1-fd8a7484d151" (UID: "74c05369-8de7-4fde-a2e1-fd8a7484d151"). InnerVolumeSpecName "kube-api-access-7xrxq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 28 13:43:25 crc kubenswrapper[4970]: I1128 13:43:25.568739 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/74c05369-8de7-4fde-a2e1-fd8a7484d151-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "74c05369-8de7-4fde-a2e1-fd8a7484d151" (UID: "74c05369-8de7-4fde-a2e1-fd8a7484d151"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 28 13:43:25 crc kubenswrapper[4970]: I1128 13:43:25.607181 4970 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/74c05369-8de7-4fde-a2e1-fd8a7484d151-utilities\") on node \"crc\" DevicePath \"\"" Nov 28 13:43:25 crc kubenswrapper[4970]: I1128 13:43:25.607230 4970 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/74c05369-8de7-4fde-a2e1-fd8a7484d151-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 28 13:43:25 crc kubenswrapper[4970]: I1128 13:43:25.607243 4970 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7xrxq\" (UniqueName: \"kubernetes.io/projected/74c05369-8de7-4fde-a2e1-fd8a7484d151-kube-api-access-7xrxq\") on node \"crc\" DevicePath \"\"" Nov 28 13:43:25 crc kubenswrapper[4970]: I1128 13:43:25.847195 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wj8h4" event={"ID":"74c05369-8de7-4fde-a2e1-fd8a7484d151","Type":"ContainerDied","Data":"05563f61ac2b8b985f4a2bde813c5bb601b5d1dfea7a22fb1996f3f35734af79"} Nov 28 13:43:25 crc kubenswrapper[4970]: I1128 13:43:25.847316 4970 scope.go:117] "RemoveContainer" containerID="34cdf53d07cf1c386720072ce112f0518ae9fe32d43e37a44c3d554d1aa1ada6" Nov 28 13:43:25 crc kubenswrapper[4970]: I1128 13:43:25.847390 4970 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-wj8h4" Nov 28 13:43:25 crc kubenswrapper[4970]: I1128 13:43:25.884076 4970 scope.go:117] "RemoveContainer" containerID="7311a6b4e2260c3164fb26e95da8c8a9cdcacd22cde8e625c5b8e66dcf39226a" Nov 28 13:43:25 crc kubenswrapper[4970]: I1128 13:43:25.896641 4970 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-wj8h4"] Nov 28 13:43:25 crc kubenswrapper[4970]: I1128 13:43:25.901589 4970 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-wj8h4"] Nov 28 13:43:25 crc kubenswrapper[4970]: I1128 13:43:25.917171 4970 scope.go:117] "RemoveContainer" containerID="4304f24a356d127074fad7c7327136a59a5a2663d473b2d385705ab4e35659a5" Nov 28 13:43:27 crc kubenswrapper[4970]: I1128 13:43:27.395712 4970 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="74c05369-8de7-4fde-a2e1-fd8a7484d151" path="/var/lib/kubelet/pods/74c05369-8de7-4fde-a2e1-fd8a7484d151/volumes" Nov 28 13:43:27 crc kubenswrapper[4970]: I1128 13:43:27.880109 4970 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-j964m" Nov 28 13:43:27 crc kubenswrapper[4970]: I1128 13:43:27.954568 4970 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-j964m" Nov 28 13:43:28 crc kubenswrapper[4970]: I1128 13:43:28.703144 4970 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-j964m"] Nov 28 13:43:29 crc kubenswrapper[4970]: I1128 13:43:29.887419 4970 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-j964m" podUID="6d3e74e9-4f48-4a11-b739-6e3557e81ba5" containerName="registry-server" containerID="cri-o://c5d043e215d8190fe58aee86c46d6525196670bf61239c64aee0cfe16f2ae0eb" gracePeriod=2 Nov 28 13:43:30 crc kubenswrapper[4970]: I1128 13:43:30.295402 4970 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-j964m" Nov 28 13:43:30 crc kubenswrapper[4970]: I1128 13:43:30.484121 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6d3e74e9-4f48-4a11-b739-6e3557e81ba5-catalog-content\") pod \"6d3e74e9-4f48-4a11-b739-6e3557e81ba5\" (UID: \"6d3e74e9-4f48-4a11-b739-6e3557e81ba5\") " Nov 28 13:43:30 crc kubenswrapper[4970]: I1128 13:43:30.484319 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wdl9b\" (UniqueName: \"kubernetes.io/projected/6d3e74e9-4f48-4a11-b739-6e3557e81ba5-kube-api-access-wdl9b\") pod \"6d3e74e9-4f48-4a11-b739-6e3557e81ba5\" (UID: \"6d3e74e9-4f48-4a11-b739-6e3557e81ba5\") " Nov 28 13:43:30 crc kubenswrapper[4970]: I1128 13:43:30.484527 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6d3e74e9-4f48-4a11-b739-6e3557e81ba5-utilities\") pod \"6d3e74e9-4f48-4a11-b739-6e3557e81ba5\" (UID: \"6d3e74e9-4f48-4a11-b739-6e3557e81ba5\") " Nov 28 13:43:30 crc kubenswrapper[4970]: I1128 13:43:30.485984 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6d3e74e9-4f48-4a11-b739-6e3557e81ba5-utilities" (OuterVolumeSpecName: "utilities") pod "6d3e74e9-4f48-4a11-b739-6e3557e81ba5" (UID: "6d3e74e9-4f48-4a11-b739-6e3557e81ba5"). 
InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 28 13:43:30 crc kubenswrapper[4970]: I1128 13:43:30.486572 4970 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6d3e74e9-4f48-4a11-b739-6e3557e81ba5-utilities\") on node \"crc\" DevicePath \"\"" Nov 28 13:43:30 crc kubenswrapper[4970]: I1128 13:43:30.490720 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6d3e74e9-4f48-4a11-b739-6e3557e81ba5-kube-api-access-wdl9b" (OuterVolumeSpecName: "kube-api-access-wdl9b") pod "6d3e74e9-4f48-4a11-b739-6e3557e81ba5" (UID: "6d3e74e9-4f48-4a11-b739-6e3557e81ba5"). InnerVolumeSpecName "kube-api-access-wdl9b". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 28 13:43:30 crc kubenswrapper[4970]: I1128 13:43:30.587860 4970 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wdl9b\" (UniqueName: \"kubernetes.io/projected/6d3e74e9-4f48-4a11-b739-6e3557e81ba5-kube-api-access-wdl9b\") on node \"crc\" DevicePath \"\"" Nov 28 13:43:30 crc kubenswrapper[4970]: I1128 13:43:30.644737 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6d3e74e9-4f48-4a11-b739-6e3557e81ba5-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "6d3e74e9-4f48-4a11-b739-6e3557e81ba5" (UID: "6d3e74e9-4f48-4a11-b739-6e3557e81ba5"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 28 13:43:30 crc kubenswrapper[4970]: I1128 13:43:30.689105 4970 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6d3e74e9-4f48-4a11-b739-6e3557e81ba5-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 28 13:43:30 crc kubenswrapper[4970]: I1128 13:43:30.900506 4970 generic.go:334] "Generic (PLEG): container finished" podID="6d3e74e9-4f48-4a11-b739-6e3557e81ba5" containerID="c5d043e215d8190fe58aee86c46d6525196670bf61239c64aee0cfe16f2ae0eb" exitCode=0 Nov 28 13:43:30 crc kubenswrapper[4970]: I1128 13:43:30.900581 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-j964m" event={"ID":"6d3e74e9-4f48-4a11-b739-6e3557e81ba5","Type":"ContainerDied","Data":"c5d043e215d8190fe58aee86c46d6525196670bf61239c64aee0cfe16f2ae0eb"} Nov 28 13:43:30 crc kubenswrapper[4970]: I1128 13:43:30.900630 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-j964m" event={"ID":"6d3e74e9-4f48-4a11-b739-6e3557e81ba5","Type":"ContainerDied","Data":"41dcc35fb9ff92753ed537a5d327e63f993b27c802ef7a7b9da500e75b1103ea"} Nov 28 13:43:30 crc kubenswrapper[4970]: I1128 13:43:30.900674 4970 scope.go:117] "RemoveContainer" containerID="c5d043e215d8190fe58aee86c46d6525196670bf61239c64aee0cfe16f2ae0eb" Nov 28 13:43:30 crc kubenswrapper[4970]: I1128 13:43:30.902504 4970 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-j964m" Nov 28 13:43:30 crc kubenswrapper[4970]: I1128 13:43:30.928455 4970 scope.go:117] "RemoveContainer" containerID="aefd424193bfa244aabaf95680fd69ebe5bf079ddd3c49278256f2b25fcf119f" Nov 28 13:43:30 crc kubenswrapper[4970]: I1128 13:43:30.955474 4970 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-j964m"] Nov 28 13:43:30 crc kubenswrapper[4970]: I1128 13:43:30.964463 4970 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-j964m"] Nov 28 13:43:30 crc kubenswrapper[4970]: I1128 13:43:30.982136 4970 scope.go:117] "RemoveContainer" containerID="3efe0570aee29c6ed8bdc1a52ca94828920b789849f04be883a3d7f849e0e758" Nov 28 13:43:31 crc kubenswrapper[4970]: I1128 13:43:31.014102 4970 scope.go:117] "RemoveContainer" containerID="c5d043e215d8190fe58aee86c46d6525196670bf61239c64aee0cfe16f2ae0eb" Nov 28 13:43:31 crc kubenswrapper[4970]: E1128 13:43:31.014680 4970 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c5d043e215d8190fe58aee86c46d6525196670bf61239c64aee0cfe16f2ae0eb\": container with ID starting with c5d043e215d8190fe58aee86c46d6525196670bf61239c64aee0cfe16f2ae0eb not found: ID does not exist" containerID="c5d043e215d8190fe58aee86c46d6525196670bf61239c64aee0cfe16f2ae0eb" Nov 28 13:43:31 crc kubenswrapper[4970]: I1128 13:43:31.014726 4970 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c5d043e215d8190fe58aee86c46d6525196670bf61239c64aee0cfe16f2ae0eb"} err="failed to get container status \"c5d043e215d8190fe58aee86c46d6525196670bf61239c64aee0cfe16f2ae0eb\": rpc error: code = NotFound desc = could not find container \"c5d043e215d8190fe58aee86c46d6525196670bf61239c64aee0cfe16f2ae0eb\": container with ID starting with c5d043e215d8190fe58aee86c46d6525196670bf61239c64aee0cfe16f2ae0eb not found: ID does not exist" Nov 28 13:43:31 crc kubenswrapper[4970]: I1128 13:43:31.014757 4970 scope.go:117] "RemoveContainer" containerID="aefd424193bfa244aabaf95680fd69ebe5bf079ddd3c49278256f2b25fcf119f" Nov 28 13:43:31 crc kubenswrapper[4970]: E1128 13:43:31.015244 4970 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"aefd424193bfa244aabaf95680fd69ebe5bf079ddd3c49278256f2b25fcf119f\": container with ID starting with aefd424193bfa244aabaf95680fd69ebe5bf079ddd3c49278256f2b25fcf119f not found: ID does not exist" containerID="aefd424193bfa244aabaf95680fd69ebe5bf079ddd3c49278256f2b25fcf119f" Nov 28 13:43:31 crc kubenswrapper[4970]: I1128 13:43:31.015270 4970 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aefd424193bfa244aabaf95680fd69ebe5bf079ddd3c49278256f2b25fcf119f"} err="failed to get container status \"aefd424193bfa244aabaf95680fd69ebe5bf079ddd3c49278256f2b25fcf119f\": rpc error: code = NotFound desc = could not find container \"aefd424193bfa244aabaf95680fd69ebe5bf079ddd3c49278256f2b25fcf119f\": container with ID starting with aefd424193bfa244aabaf95680fd69ebe5bf079ddd3c49278256f2b25fcf119f not found: ID does not exist" Nov 28 13:43:31 crc kubenswrapper[4970]: I1128 13:43:31.015288 4970 scope.go:117] "RemoveContainer" containerID="3efe0570aee29c6ed8bdc1a52ca94828920b789849f04be883a3d7f849e0e758" Nov 28 13:43:31 crc kubenswrapper[4970]: E1128 13:43:31.015654 4970 log.go:32] "ContainerStatus from runtime service failed" 
err="rpc error: code = NotFound desc = could not find container \"3efe0570aee29c6ed8bdc1a52ca94828920b789849f04be883a3d7f849e0e758\": container with ID starting with 3efe0570aee29c6ed8bdc1a52ca94828920b789849f04be883a3d7f849e0e758 not found: ID does not exist" containerID="3efe0570aee29c6ed8bdc1a52ca94828920b789849f04be883a3d7f849e0e758" Nov 28 13:43:31 crc kubenswrapper[4970]: I1128 13:43:31.015731 4970 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3efe0570aee29c6ed8bdc1a52ca94828920b789849f04be883a3d7f849e0e758"} err="failed to get container status \"3efe0570aee29c6ed8bdc1a52ca94828920b789849f04be883a3d7f849e0e758\": rpc error: code = NotFound desc = could not find container \"3efe0570aee29c6ed8bdc1a52ca94828920b789849f04be883a3d7f849e0e758\": container with ID starting with 3efe0570aee29c6ed8bdc1a52ca94828920b789849f04be883a3d7f849e0e758 not found: ID does not exist" Nov 28 13:43:31 crc kubenswrapper[4970]: I1128 13:43:31.392359 4970 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6d3e74e9-4f48-4a11-b739-6e3557e81ba5" path="/var/lib/kubelet/pods/6d3e74e9-4f48-4a11-b739-6e3557e81ba5/volumes" Nov 28 13:43:50 crc kubenswrapper[4970]: I1128 13:43:50.629563 4970 scope.go:117] "RemoveContainer" containerID="6701c5716a5247433b429d6f5968196a7fa1e126e588af748466c746cb7cc161" Nov 28 13:43:50 crc kubenswrapper[4970]: I1128 13:43:50.670677 4970 scope.go:117] "RemoveContainer" containerID="8d21e62023d04163a1dc3228bdcf8813cc72a50bac07eb13bc708219ccd0d8f5" Nov 28 13:43:50 crc kubenswrapper[4970]: I1128 13:43:50.722842 4970 scope.go:117] "RemoveContainer" containerID="1ac927e454374093b3f0d8e849e7f1cb74e24e63da4ecb000b30668ead8997d4" Nov 28 13:43:50 crc kubenswrapper[4970]: I1128 13:43:50.768558 4970 scope.go:117] "RemoveContainer" containerID="646cbfa6b7429c72516aecbb90b650d3c46225a5c6b68aa8d3e4f3e7653dc6c9" Nov 28 13:43:50 crc kubenswrapper[4970]: I1128 13:43:50.846998 4970 scope.go:117] "RemoveContainer" containerID="212632bd085249a884fbb45055201c25427f0ae5c9635a47c2d8655d2e01588f" Nov 28 13:43:50 crc kubenswrapper[4970]: I1128 13:43:50.875504 4970 scope.go:117] "RemoveContainer" containerID="5dbcba2b210b3de5accd5a01a27cf55d229ed1171df5f836eda2c2bf5325fd1e" Nov 28 13:43:51 crc kubenswrapper[4970]: I1128 13:43:51.334104 4970 patch_prober.go:28] interesting pod/machine-config-daemon-tjrng container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 28 13:43:51 crc kubenswrapper[4970]: I1128 13:43:51.334191 4970 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-tjrng" podUID="70bedd43-c527-436e-b47b-0b9ec5b10601" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 28 13:44:21 crc kubenswrapper[4970]: I1128 13:44:21.333869 4970 patch_prober.go:28] interesting pod/machine-config-daemon-tjrng container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 28 13:44:21 crc kubenswrapper[4970]: I1128 13:44:21.336496 4970 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-tjrng" 
podUID="70bedd43-c527-436e-b47b-0b9ec5b10601" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 28 13:44:24 crc kubenswrapper[4970]: I1128 13:44:24.292892 4970 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-vnm6l"] Nov 28 13:44:24 crc kubenswrapper[4970]: E1128 13:44:24.293536 4970 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="74c05369-8de7-4fde-a2e1-fd8a7484d151" containerName="extract-content" Nov 28 13:44:24 crc kubenswrapper[4970]: I1128 13:44:24.293556 4970 state_mem.go:107] "Deleted CPUSet assignment" podUID="74c05369-8de7-4fde-a2e1-fd8a7484d151" containerName="extract-content" Nov 28 13:44:24 crc kubenswrapper[4970]: E1128 13:44:24.293575 4970 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="74c05369-8de7-4fde-a2e1-fd8a7484d151" containerName="registry-server" Nov 28 13:44:24 crc kubenswrapper[4970]: I1128 13:44:24.293589 4970 state_mem.go:107] "Deleted CPUSet assignment" podUID="74c05369-8de7-4fde-a2e1-fd8a7484d151" containerName="registry-server" Nov 28 13:44:24 crc kubenswrapper[4970]: E1128 13:44:24.293608 4970 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6d3e74e9-4f48-4a11-b739-6e3557e81ba5" containerName="extract-content" Nov 28 13:44:24 crc kubenswrapper[4970]: I1128 13:44:24.293700 4970 state_mem.go:107] "Deleted CPUSet assignment" podUID="6d3e74e9-4f48-4a11-b739-6e3557e81ba5" containerName="extract-content" Nov 28 13:44:24 crc kubenswrapper[4970]: E1128 13:44:24.293743 4970 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="74c05369-8de7-4fde-a2e1-fd8a7484d151" containerName="extract-utilities" Nov 28 13:44:24 crc kubenswrapper[4970]: I1128 13:44:24.293756 4970 state_mem.go:107] "Deleted CPUSet assignment" podUID="74c05369-8de7-4fde-a2e1-fd8a7484d151" containerName="extract-utilities" Nov 28 13:44:24 crc kubenswrapper[4970]: E1128 13:44:24.293774 4970 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6d3e74e9-4f48-4a11-b739-6e3557e81ba5" containerName="registry-server" Nov 28 13:44:24 crc kubenswrapper[4970]: I1128 13:44:24.293786 4970 state_mem.go:107] "Deleted CPUSet assignment" podUID="6d3e74e9-4f48-4a11-b739-6e3557e81ba5" containerName="registry-server" Nov 28 13:44:24 crc kubenswrapper[4970]: E1128 13:44:24.293800 4970 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6d3e74e9-4f48-4a11-b739-6e3557e81ba5" containerName="extract-utilities" Nov 28 13:44:24 crc kubenswrapper[4970]: I1128 13:44:24.293812 4970 state_mem.go:107] "Deleted CPUSet assignment" podUID="6d3e74e9-4f48-4a11-b739-6e3557e81ba5" containerName="extract-utilities" Nov 28 13:44:24 crc kubenswrapper[4970]: I1128 13:44:24.294010 4970 memory_manager.go:354] "RemoveStaleState removing state" podUID="74c05369-8de7-4fde-a2e1-fd8a7484d151" containerName="registry-server" Nov 28 13:44:24 crc kubenswrapper[4970]: I1128 13:44:24.294048 4970 memory_manager.go:354] "RemoveStaleState removing state" podUID="6d3e74e9-4f48-4a11-b739-6e3557e81ba5" containerName="registry-server" Nov 28 13:44:24 crc kubenswrapper[4970]: I1128 13:44:24.295612 4970 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-vnm6l" Nov 28 13:44:24 crc kubenswrapper[4970]: I1128 13:44:24.313589 4970 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-vnm6l"] Nov 28 13:44:24 crc kubenswrapper[4970]: I1128 13:44:24.463812 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rvxx8\" (UniqueName: \"kubernetes.io/projected/5a86464f-d479-4657-a68e-258786d37cad-kube-api-access-rvxx8\") pod \"certified-operators-vnm6l\" (UID: \"5a86464f-d479-4657-a68e-258786d37cad\") " pod="openshift-marketplace/certified-operators-vnm6l" Nov 28 13:44:24 crc kubenswrapper[4970]: I1128 13:44:24.464241 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5a86464f-d479-4657-a68e-258786d37cad-utilities\") pod \"certified-operators-vnm6l\" (UID: \"5a86464f-d479-4657-a68e-258786d37cad\") " pod="openshift-marketplace/certified-operators-vnm6l" Nov 28 13:44:24 crc kubenswrapper[4970]: I1128 13:44:24.464276 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5a86464f-d479-4657-a68e-258786d37cad-catalog-content\") pod \"certified-operators-vnm6l\" (UID: \"5a86464f-d479-4657-a68e-258786d37cad\") " pod="openshift-marketplace/certified-operators-vnm6l" Nov 28 13:44:24 crc kubenswrapper[4970]: I1128 13:44:24.565976 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5a86464f-d479-4657-a68e-258786d37cad-utilities\") pod \"certified-operators-vnm6l\" (UID: \"5a86464f-d479-4657-a68e-258786d37cad\") " pod="openshift-marketplace/certified-operators-vnm6l" Nov 28 13:44:24 crc kubenswrapper[4970]: I1128 13:44:24.566041 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5a86464f-d479-4657-a68e-258786d37cad-catalog-content\") pod \"certified-operators-vnm6l\" (UID: \"5a86464f-d479-4657-a68e-258786d37cad\") " pod="openshift-marketplace/certified-operators-vnm6l" Nov 28 13:44:24 crc kubenswrapper[4970]: I1128 13:44:24.566078 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rvxx8\" (UniqueName: \"kubernetes.io/projected/5a86464f-d479-4657-a68e-258786d37cad-kube-api-access-rvxx8\") pod \"certified-operators-vnm6l\" (UID: \"5a86464f-d479-4657-a68e-258786d37cad\") " pod="openshift-marketplace/certified-operators-vnm6l" Nov 28 13:44:24 crc kubenswrapper[4970]: I1128 13:44:24.566939 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5a86464f-d479-4657-a68e-258786d37cad-utilities\") pod \"certified-operators-vnm6l\" (UID: \"5a86464f-d479-4657-a68e-258786d37cad\") " pod="openshift-marketplace/certified-operators-vnm6l" Nov 28 13:44:24 crc kubenswrapper[4970]: I1128 13:44:24.567010 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5a86464f-d479-4657-a68e-258786d37cad-catalog-content\") pod \"certified-operators-vnm6l\" (UID: \"5a86464f-d479-4657-a68e-258786d37cad\") " pod="openshift-marketplace/certified-operators-vnm6l" Nov 28 13:44:24 crc kubenswrapper[4970]: I1128 13:44:24.593493 4970 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-rvxx8\" (UniqueName: \"kubernetes.io/projected/5a86464f-d479-4657-a68e-258786d37cad-kube-api-access-rvxx8\") pod \"certified-operators-vnm6l\" (UID: \"5a86464f-d479-4657-a68e-258786d37cad\") " pod="openshift-marketplace/certified-operators-vnm6l" Nov 28 13:44:24 crc kubenswrapper[4970]: I1128 13:44:24.614708 4970 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-vnm6l" Nov 28 13:44:24 crc kubenswrapper[4970]: I1128 13:44:24.905494 4970 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-vnm6l"] Nov 28 13:44:25 crc kubenswrapper[4970]: I1128 13:44:25.409491 4970 generic.go:334] "Generic (PLEG): container finished" podID="5a86464f-d479-4657-a68e-258786d37cad" containerID="453a476fea5cf875543296fddc8e97dae94279f563a06ce4f00480b1f6ae7a50" exitCode=0 Nov 28 13:44:25 crc kubenswrapper[4970]: I1128 13:44:25.409558 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vnm6l" event={"ID":"5a86464f-d479-4657-a68e-258786d37cad","Type":"ContainerDied","Data":"453a476fea5cf875543296fddc8e97dae94279f563a06ce4f00480b1f6ae7a50"} Nov 28 13:44:25 crc kubenswrapper[4970]: I1128 13:44:25.409595 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vnm6l" event={"ID":"5a86464f-d479-4657-a68e-258786d37cad","Type":"ContainerStarted","Data":"40877ff8cce99d404c0d1a2d05e7790cc8b78b4ec199c190010f3f2931ef2ded"} Nov 28 13:44:27 crc kubenswrapper[4970]: I1128 13:44:27.442294 4970 generic.go:334] "Generic (PLEG): container finished" podID="5a86464f-d479-4657-a68e-258786d37cad" containerID="9e239fb0820bda6fd54268570778aeca60aa9cae1dec8ea018041214d77d3501" exitCode=0 Nov 28 13:44:27 crc kubenswrapper[4970]: I1128 13:44:27.442413 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vnm6l" event={"ID":"5a86464f-d479-4657-a68e-258786d37cad","Type":"ContainerDied","Data":"9e239fb0820bda6fd54268570778aeca60aa9cae1dec8ea018041214d77d3501"} Nov 28 13:44:28 crc kubenswrapper[4970]: I1128 13:44:28.465522 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vnm6l" event={"ID":"5a86464f-d479-4657-a68e-258786d37cad","Type":"ContainerStarted","Data":"04a9c22a220dec26538a8dd291a357ef1cd5173bdf4cde83f10b72e54fefc12f"} Nov 28 13:44:28 crc kubenswrapper[4970]: I1128 13:44:28.495419 4970 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-vnm6l" podStartSLOduration=1.93160534 podStartE2EDuration="4.4953925s" podCreationTimestamp="2025-11-28 13:44:24 +0000 UTC" firstStartedPulling="2025-11-28 13:44:25.412511519 +0000 UTC m=+1476.265393329" lastFinishedPulling="2025-11-28 13:44:27.976298679 +0000 UTC m=+1478.829180489" observedRunningTime="2025-11-28 13:44:28.490701059 +0000 UTC m=+1479.343582919" watchObservedRunningTime="2025-11-28 13:44:28.4953925 +0000 UTC m=+1479.348274340" Nov 28 13:44:34 crc kubenswrapper[4970]: I1128 13:44:34.615111 4970 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-vnm6l" Nov 28 13:44:34 crc kubenswrapper[4970]: I1128 13:44:34.615995 4970 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-vnm6l" Nov 28 13:44:34 crc kubenswrapper[4970]: I1128 13:44:34.668075 4970 
kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-vnm6l" Nov 28 13:44:35 crc kubenswrapper[4970]: I1128 13:44:35.598448 4970 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-vnm6l" Nov 28 13:44:35 crc kubenswrapper[4970]: I1128 13:44:35.662507 4970 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-vnm6l"] Nov 28 13:44:37 crc kubenswrapper[4970]: I1128 13:44:37.541697 4970 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-vnm6l" podUID="5a86464f-d479-4657-a68e-258786d37cad" containerName="registry-server" containerID="cri-o://04a9c22a220dec26538a8dd291a357ef1cd5173bdf4cde83f10b72e54fefc12f" gracePeriod=2 Nov 28 13:44:38 crc kubenswrapper[4970]: I1128 13:44:38.448386 4970 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-vnm6l" Nov 28 13:44:38 crc kubenswrapper[4970]: I1128 13:44:38.507576 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5a86464f-d479-4657-a68e-258786d37cad-utilities\") pod \"5a86464f-d479-4657-a68e-258786d37cad\" (UID: \"5a86464f-d479-4657-a68e-258786d37cad\") " Nov 28 13:44:38 crc kubenswrapper[4970]: I1128 13:44:38.507697 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rvxx8\" (UniqueName: \"kubernetes.io/projected/5a86464f-d479-4657-a68e-258786d37cad-kube-api-access-rvxx8\") pod \"5a86464f-d479-4657-a68e-258786d37cad\" (UID: \"5a86464f-d479-4657-a68e-258786d37cad\") " Nov 28 13:44:38 crc kubenswrapper[4970]: I1128 13:44:38.507728 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5a86464f-d479-4657-a68e-258786d37cad-catalog-content\") pod \"5a86464f-d479-4657-a68e-258786d37cad\" (UID: \"5a86464f-d479-4657-a68e-258786d37cad\") " Nov 28 13:44:38 crc kubenswrapper[4970]: I1128 13:44:38.508833 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5a86464f-d479-4657-a68e-258786d37cad-utilities" (OuterVolumeSpecName: "utilities") pod "5a86464f-d479-4657-a68e-258786d37cad" (UID: "5a86464f-d479-4657-a68e-258786d37cad"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 28 13:44:38 crc kubenswrapper[4970]: I1128 13:44:38.513447 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5a86464f-d479-4657-a68e-258786d37cad-kube-api-access-rvxx8" (OuterVolumeSpecName: "kube-api-access-rvxx8") pod "5a86464f-d479-4657-a68e-258786d37cad" (UID: "5a86464f-d479-4657-a68e-258786d37cad"). InnerVolumeSpecName "kube-api-access-rvxx8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 28 13:44:38 crc kubenswrapper[4970]: I1128 13:44:38.550517 4970 generic.go:334] "Generic (PLEG): container finished" podID="5a86464f-d479-4657-a68e-258786d37cad" containerID="04a9c22a220dec26538a8dd291a357ef1cd5173bdf4cde83f10b72e54fefc12f" exitCode=0 Nov 28 13:44:38 crc kubenswrapper[4970]: I1128 13:44:38.550559 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vnm6l" event={"ID":"5a86464f-d479-4657-a68e-258786d37cad","Type":"ContainerDied","Data":"04a9c22a220dec26538a8dd291a357ef1cd5173bdf4cde83f10b72e54fefc12f"} Nov 28 13:44:38 crc kubenswrapper[4970]: I1128 13:44:38.550567 4970 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-vnm6l" Nov 28 13:44:38 crc kubenswrapper[4970]: I1128 13:44:38.550583 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vnm6l" event={"ID":"5a86464f-d479-4657-a68e-258786d37cad","Type":"ContainerDied","Data":"40877ff8cce99d404c0d1a2d05e7790cc8b78b4ec199c190010f3f2931ef2ded"} Nov 28 13:44:38 crc kubenswrapper[4970]: I1128 13:44:38.550603 4970 scope.go:117] "RemoveContainer" containerID="04a9c22a220dec26538a8dd291a357ef1cd5173bdf4cde83f10b72e54fefc12f" Nov 28 13:44:38 crc kubenswrapper[4970]: I1128 13:44:38.557649 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5a86464f-d479-4657-a68e-258786d37cad-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5a86464f-d479-4657-a68e-258786d37cad" (UID: "5a86464f-d479-4657-a68e-258786d37cad"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 28 13:44:38 crc kubenswrapper[4970]: I1128 13:44:38.572989 4970 scope.go:117] "RemoveContainer" containerID="9e239fb0820bda6fd54268570778aeca60aa9cae1dec8ea018041214d77d3501" Nov 28 13:44:38 crc kubenswrapper[4970]: I1128 13:44:38.589142 4970 scope.go:117] "RemoveContainer" containerID="453a476fea5cf875543296fddc8e97dae94279f563a06ce4f00480b1f6ae7a50" Nov 28 13:44:38 crc kubenswrapper[4970]: I1128 13:44:38.608758 4970 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rvxx8\" (UniqueName: \"kubernetes.io/projected/5a86464f-d479-4657-a68e-258786d37cad-kube-api-access-rvxx8\") on node \"crc\" DevicePath \"\"" Nov 28 13:44:38 crc kubenswrapper[4970]: I1128 13:44:38.608944 4970 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5a86464f-d479-4657-a68e-258786d37cad-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 28 13:44:38 crc kubenswrapper[4970]: I1128 13:44:38.608954 4970 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5a86464f-d479-4657-a68e-258786d37cad-utilities\") on node \"crc\" DevicePath \"\"" Nov 28 13:44:38 crc kubenswrapper[4970]: I1128 13:44:38.615330 4970 scope.go:117] "RemoveContainer" containerID="04a9c22a220dec26538a8dd291a357ef1cd5173bdf4cde83f10b72e54fefc12f" Nov 28 13:44:38 crc kubenswrapper[4970]: E1128 13:44:38.615814 4970 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"04a9c22a220dec26538a8dd291a357ef1cd5173bdf4cde83f10b72e54fefc12f\": container with ID starting with 04a9c22a220dec26538a8dd291a357ef1cd5173bdf4cde83f10b72e54fefc12f not found: ID does not exist" 
containerID="04a9c22a220dec26538a8dd291a357ef1cd5173bdf4cde83f10b72e54fefc12f" Nov 28 13:44:38 crc kubenswrapper[4970]: I1128 13:44:38.615865 4970 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"04a9c22a220dec26538a8dd291a357ef1cd5173bdf4cde83f10b72e54fefc12f"} err="failed to get container status \"04a9c22a220dec26538a8dd291a357ef1cd5173bdf4cde83f10b72e54fefc12f\": rpc error: code = NotFound desc = could not find container \"04a9c22a220dec26538a8dd291a357ef1cd5173bdf4cde83f10b72e54fefc12f\": container with ID starting with 04a9c22a220dec26538a8dd291a357ef1cd5173bdf4cde83f10b72e54fefc12f not found: ID does not exist" Nov 28 13:44:38 crc kubenswrapper[4970]: I1128 13:44:38.615896 4970 scope.go:117] "RemoveContainer" containerID="9e239fb0820bda6fd54268570778aeca60aa9cae1dec8ea018041214d77d3501" Nov 28 13:44:38 crc kubenswrapper[4970]: E1128 13:44:38.616289 4970 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9e239fb0820bda6fd54268570778aeca60aa9cae1dec8ea018041214d77d3501\": container with ID starting with 9e239fb0820bda6fd54268570778aeca60aa9cae1dec8ea018041214d77d3501 not found: ID does not exist" containerID="9e239fb0820bda6fd54268570778aeca60aa9cae1dec8ea018041214d77d3501" Nov 28 13:44:38 crc kubenswrapper[4970]: I1128 13:44:38.616315 4970 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9e239fb0820bda6fd54268570778aeca60aa9cae1dec8ea018041214d77d3501"} err="failed to get container status \"9e239fb0820bda6fd54268570778aeca60aa9cae1dec8ea018041214d77d3501\": rpc error: code = NotFound desc = could not find container \"9e239fb0820bda6fd54268570778aeca60aa9cae1dec8ea018041214d77d3501\": container with ID starting with 9e239fb0820bda6fd54268570778aeca60aa9cae1dec8ea018041214d77d3501 not found: ID does not exist" Nov 28 13:44:38 crc kubenswrapper[4970]: I1128 13:44:38.616334 4970 scope.go:117] "RemoveContainer" containerID="453a476fea5cf875543296fddc8e97dae94279f563a06ce4f00480b1f6ae7a50" Nov 28 13:44:38 crc kubenswrapper[4970]: E1128 13:44:38.616620 4970 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"453a476fea5cf875543296fddc8e97dae94279f563a06ce4f00480b1f6ae7a50\": container with ID starting with 453a476fea5cf875543296fddc8e97dae94279f563a06ce4f00480b1f6ae7a50 not found: ID does not exist" containerID="453a476fea5cf875543296fddc8e97dae94279f563a06ce4f00480b1f6ae7a50" Nov 28 13:44:38 crc kubenswrapper[4970]: I1128 13:44:38.616641 4970 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"453a476fea5cf875543296fddc8e97dae94279f563a06ce4f00480b1f6ae7a50"} err="failed to get container status \"453a476fea5cf875543296fddc8e97dae94279f563a06ce4f00480b1f6ae7a50\": rpc error: code = NotFound desc = could not find container \"453a476fea5cf875543296fddc8e97dae94279f563a06ce4f00480b1f6ae7a50\": container with ID starting with 453a476fea5cf875543296fddc8e97dae94279f563a06ce4f00480b1f6ae7a50 not found: ID does not exist" Nov 28 13:44:38 crc kubenswrapper[4970]: I1128 13:44:38.883478 4970 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-vnm6l"] Nov 28 13:44:38 crc kubenswrapper[4970]: I1128 13:44:38.914637 4970 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-vnm6l"] Nov 28 13:44:39 crc kubenswrapper[4970]: I1128 13:44:39.395165 
4970 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5a86464f-d479-4657-a68e-258786d37cad" path="/var/lib/kubelet/pods/5a86464f-d479-4657-a68e-258786d37cad/volumes" Nov 28 13:44:49 crc kubenswrapper[4970]: I1128 13:44:49.918255 4970 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-p8ckr"] Nov 28 13:44:49 crc kubenswrapper[4970]: E1128 13:44:49.919095 4970 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5a86464f-d479-4657-a68e-258786d37cad" containerName="extract-utilities" Nov 28 13:44:49 crc kubenswrapper[4970]: I1128 13:44:49.919117 4970 state_mem.go:107] "Deleted CPUSet assignment" podUID="5a86464f-d479-4657-a68e-258786d37cad" containerName="extract-utilities" Nov 28 13:44:49 crc kubenswrapper[4970]: E1128 13:44:49.919163 4970 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5a86464f-d479-4657-a68e-258786d37cad" containerName="extract-content" Nov 28 13:44:49 crc kubenswrapper[4970]: I1128 13:44:49.919178 4970 state_mem.go:107] "Deleted CPUSet assignment" podUID="5a86464f-d479-4657-a68e-258786d37cad" containerName="extract-content" Nov 28 13:44:49 crc kubenswrapper[4970]: E1128 13:44:49.919193 4970 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5a86464f-d479-4657-a68e-258786d37cad" containerName="registry-server" Nov 28 13:44:49 crc kubenswrapper[4970]: I1128 13:44:49.919206 4970 state_mem.go:107] "Deleted CPUSet assignment" podUID="5a86464f-d479-4657-a68e-258786d37cad" containerName="registry-server" Nov 28 13:44:49 crc kubenswrapper[4970]: I1128 13:44:49.919458 4970 memory_manager.go:354] "RemoveStaleState removing state" podUID="5a86464f-d479-4657-a68e-258786d37cad" containerName="registry-server" Nov 28 13:44:49 crc kubenswrapper[4970]: I1128 13:44:49.920954 4970 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-p8ckr" Nov 28 13:44:49 crc kubenswrapper[4970]: I1128 13:44:49.932392 4970 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-p8ckr"] Nov 28 13:44:50 crc kubenswrapper[4970]: I1128 13:44:50.085859 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kbpv8\" (UniqueName: \"kubernetes.io/projected/7ad4940c-86a5-4aa0-b805-e8193b8b9ff6-kube-api-access-kbpv8\") pod \"redhat-marketplace-p8ckr\" (UID: \"7ad4940c-86a5-4aa0-b805-e8193b8b9ff6\") " pod="openshift-marketplace/redhat-marketplace-p8ckr" Nov 28 13:44:50 crc kubenswrapper[4970]: I1128 13:44:50.086098 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7ad4940c-86a5-4aa0-b805-e8193b8b9ff6-utilities\") pod \"redhat-marketplace-p8ckr\" (UID: \"7ad4940c-86a5-4aa0-b805-e8193b8b9ff6\") " pod="openshift-marketplace/redhat-marketplace-p8ckr" Nov 28 13:44:50 crc kubenswrapper[4970]: I1128 13:44:50.086595 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7ad4940c-86a5-4aa0-b805-e8193b8b9ff6-catalog-content\") pod \"redhat-marketplace-p8ckr\" (UID: \"7ad4940c-86a5-4aa0-b805-e8193b8b9ff6\") " pod="openshift-marketplace/redhat-marketplace-p8ckr" Nov 28 13:44:50 crc kubenswrapper[4970]: I1128 13:44:50.187739 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kbpv8\" (UniqueName: \"kubernetes.io/projected/7ad4940c-86a5-4aa0-b805-e8193b8b9ff6-kube-api-access-kbpv8\") pod \"redhat-marketplace-p8ckr\" (UID: \"7ad4940c-86a5-4aa0-b805-e8193b8b9ff6\") " pod="openshift-marketplace/redhat-marketplace-p8ckr" Nov 28 13:44:50 crc kubenswrapper[4970]: I1128 13:44:50.187806 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7ad4940c-86a5-4aa0-b805-e8193b8b9ff6-utilities\") pod \"redhat-marketplace-p8ckr\" (UID: \"7ad4940c-86a5-4aa0-b805-e8193b8b9ff6\") " pod="openshift-marketplace/redhat-marketplace-p8ckr" Nov 28 13:44:50 crc kubenswrapper[4970]: I1128 13:44:50.187851 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7ad4940c-86a5-4aa0-b805-e8193b8b9ff6-catalog-content\") pod \"redhat-marketplace-p8ckr\" (UID: \"7ad4940c-86a5-4aa0-b805-e8193b8b9ff6\") " pod="openshift-marketplace/redhat-marketplace-p8ckr" Nov 28 13:44:50 crc kubenswrapper[4970]: I1128 13:44:50.188422 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7ad4940c-86a5-4aa0-b805-e8193b8b9ff6-catalog-content\") pod \"redhat-marketplace-p8ckr\" (UID: \"7ad4940c-86a5-4aa0-b805-e8193b8b9ff6\") " pod="openshift-marketplace/redhat-marketplace-p8ckr" Nov 28 13:44:50 crc kubenswrapper[4970]: I1128 13:44:50.188502 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7ad4940c-86a5-4aa0-b805-e8193b8b9ff6-utilities\") pod \"redhat-marketplace-p8ckr\" (UID: \"7ad4940c-86a5-4aa0-b805-e8193b8b9ff6\") " pod="openshift-marketplace/redhat-marketplace-p8ckr" Nov 28 13:44:50 crc kubenswrapper[4970]: I1128 13:44:50.209261 4970 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-kbpv8\" (UniqueName: \"kubernetes.io/projected/7ad4940c-86a5-4aa0-b805-e8193b8b9ff6-kube-api-access-kbpv8\") pod \"redhat-marketplace-p8ckr\" (UID: \"7ad4940c-86a5-4aa0-b805-e8193b8b9ff6\") " pod="openshift-marketplace/redhat-marketplace-p8ckr" Nov 28 13:44:50 crc kubenswrapper[4970]: I1128 13:44:50.288251 4970 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-p8ckr" Nov 28 13:44:50 crc kubenswrapper[4970]: I1128 13:44:50.467566 4970 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-p8ckr"] Nov 28 13:44:50 crc kubenswrapper[4970]: I1128 13:44:50.661077 4970 generic.go:334] "Generic (PLEG): container finished" podID="7ad4940c-86a5-4aa0-b805-e8193b8b9ff6" containerID="71480ca768ef049b29bbfe682f424ffeb210d351d1be20cf24e15058c17f462e" exitCode=0 Nov 28 13:44:50 crc kubenswrapper[4970]: I1128 13:44:50.661120 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-p8ckr" event={"ID":"7ad4940c-86a5-4aa0-b805-e8193b8b9ff6","Type":"ContainerDied","Data":"71480ca768ef049b29bbfe682f424ffeb210d351d1be20cf24e15058c17f462e"} Nov 28 13:44:50 crc kubenswrapper[4970]: I1128 13:44:50.661146 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-p8ckr" event={"ID":"7ad4940c-86a5-4aa0-b805-e8193b8b9ff6","Type":"ContainerStarted","Data":"7c239904717c23e8d53b2db9c06a41ba998412a4ec809a6618b9ca468cfbfc92"} Nov 28 13:44:51 crc kubenswrapper[4970]: I1128 13:44:51.334021 4970 patch_prober.go:28] interesting pod/machine-config-daemon-tjrng container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 28 13:44:51 crc kubenswrapper[4970]: I1128 13:44:51.334470 4970 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-tjrng" podUID="70bedd43-c527-436e-b47b-0b9ec5b10601" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 28 13:44:51 crc kubenswrapper[4970]: I1128 13:44:51.334544 4970 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-tjrng" Nov 28 13:44:51 crc kubenswrapper[4970]: I1128 13:44:51.335403 4970 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"8085d03de453d8ba9c452eec6c018c90247e638e269f72dd81e94e709f69fd50"} pod="openshift-machine-config-operator/machine-config-daemon-tjrng" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Nov 28 13:44:51 crc kubenswrapper[4970]: I1128 13:44:51.335484 4970 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-tjrng" podUID="70bedd43-c527-436e-b47b-0b9ec5b10601" containerName="machine-config-daemon" containerID="cri-o://8085d03de453d8ba9c452eec6c018c90247e638e269f72dd81e94e709f69fd50" gracePeriod=600 Nov 28 13:44:51 crc kubenswrapper[4970]: E1128 13:44:51.477409 4970 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-tjrng_openshift-machine-config-operator(70bedd43-c527-436e-b47b-0b9ec5b10601)\"" pod="openshift-machine-config-operator/machine-config-daemon-tjrng" podUID="70bedd43-c527-436e-b47b-0b9ec5b10601" Nov 28 13:44:51 crc kubenswrapper[4970]: I1128 13:44:51.671412 4970 generic.go:334] "Generic (PLEG): container finished" podID="70bedd43-c527-436e-b47b-0b9ec5b10601" containerID="8085d03de453d8ba9c452eec6c018c90247e638e269f72dd81e94e709f69fd50" exitCode=0 Nov 28 13:44:51 crc kubenswrapper[4970]: I1128 13:44:51.671473 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-tjrng" event={"ID":"70bedd43-c527-436e-b47b-0b9ec5b10601","Type":"ContainerDied","Data":"8085d03de453d8ba9c452eec6c018c90247e638e269f72dd81e94e709f69fd50"} Nov 28 13:44:51 crc kubenswrapper[4970]: I1128 13:44:51.671541 4970 scope.go:117] "RemoveContainer" containerID="3b2419e8c0d194a5a29f4c079224199be6f97c77dca32e7413992a8fbfc0b4d2" Nov 28 13:44:51 crc kubenswrapper[4970]: I1128 13:44:51.672201 4970 scope.go:117] "RemoveContainer" containerID="8085d03de453d8ba9c452eec6c018c90247e638e269f72dd81e94e709f69fd50" Nov 28 13:44:51 crc kubenswrapper[4970]: E1128 13:44:51.672863 4970 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tjrng_openshift-machine-config-operator(70bedd43-c527-436e-b47b-0b9ec5b10601)\"" pod="openshift-machine-config-operator/machine-config-daemon-tjrng" podUID="70bedd43-c527-436e-b47b-0b9ec5b10601" Nov 28 13:44:52 crc kubenswrapper[4970]: I1128 13:44:52.679065 4970 generic.go:334] "Generic (PLEG): container finished" podID="7ad4940c-86a5-4aa0-b805-e8193b8b9ff6" containerID="9de7b437f12a7e92ce4877ed62ea1f2d2e30dbd23e458ae44e01b3adea3c34f1" exitCode=0 Nov 28 13:44:52 crc kubenswrapper[4970]: I1128 13:44:52.679175 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-p8ckr" event={"ID":"7ad4940c-86a5-4aa0-b805-e8193b8b9ff6","Type":"ContainerDied","Data":"9de7b437f12a7e92ce4877ed62ea1f2d2e30dbd23e458ae44e01b3adea3c34f1"} Nov 28 13:44:53 crc kubenswrapper[4970]: I1128 13:44:53.692197 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-p8ckr" event={"ID":"7ad4940c-86a5-4aa0-b805-e8193b8b9ff6","Type":"ContainerStarted","Data":"4c70a80241d4272f500cdc2ea1e45975e356dc41c64db9ba439317dd12582722"} Nov 28 13:44:53 crc kubenswrapper[4970]: I1128 13:44:53.717777 4970 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-p8ckr" podStartSLOduration=2.267902343 podStartE2EDuration="4.717750425s" podCreationTimestamp="2025-11-28 13:44:49 +0000 UTC" firstStartedPulling="2025-11-28 13:44:50.663359581 +0000 UTC m=+1501.516241381" lastFinishedPulling="2025-11-28 13:44:53.113207663 +0000 UTC m=+1503.966089463" observedRunningTime="2025-11-28 13:44:53.711008477 +0000 UTC m=+1504.563890317" watchObservedRunningTime="2025-11-28 13:44:53.717750425 +0000 UTC m=+1504.570632265" Nov 28 13:45:00 crc kubenswrapper[4970]: I1128 13:45:00.153232 4970 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29405625-nkqmr"] Nov 28 13:45:00 crc kubenswrapper[4970]: I1128 13:45:00.154449 4970 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29405625-nkqmr" Nov 28 13:45:00 crc kubenswrapper[4970]: I1128 13:45:00.157256 4970 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Nov 28 13:45:00 crc kubenswrapper[4970]: I1128 13:45:00.157265 4970 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Nov 28 13:45:00 crc kubenswrapper[4970]: I1128 13:45:00.173990 4970 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29405625-nkqmr"] Nov 28 13:45:00 crc kubenswrapper[4970]: I1128 13:45:00.288897 4970 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-p8ckr" Nov 28 13:45:00 crc kubenswrapper[4970]: I1128 13:45:00.288988 4970 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-p8ckr" Nov 28 13:45:00 crc kubenswrapper[4970]: I1128 13:45:00.330598 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/fb8cddf3-a4bf-4e9c-bce1-51b19e1fefee-secret-volume\") pod \"collect-profiles-29405625-nkqmr\" (UID: \"fb8cddf3-a4bf-4e9c-bce1-51b19e1fefee\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29405625-nkqmr" Nov 28 13:45:00 crc kubenswrapper[4970]: I1128 13:45:00.330981 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/fb8cddf3-a4bf-4e9c-bce1-51b19e1fefee-config-volume\") pod \"collect-profiles-29405625-nkqmr\" (UID: \"fb8cddf3-a4bf-4e9c-bce1-51b19e1fefee\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29405625-nkqmr" Nov 28 13:45:00 crc kubenswrapper[4970]: I1128 13:45:00.331167 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-px66c\" (UniqueName: \"kubernetes.io/projected/fb8cddf3-a4bf-4e9c-bce1-51b19e1fefee-kube-api-access-px66c\") pod \"collect-profiles-29405625-nkqmr\" (UID: \"fb8cddf3-a4bf-4e9c-bce1-51b19e1fefee\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29405625-nkqmr" Nov 28 13:45:00 crc kubenswrapper[4970]: I1128 13:45:00.344964 4970 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-p8ckr" Nov 28 13:45:00 crc kubenswrapper[4970]: I1128 13:45:00.433339 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-px66c\" (UniqueName: \"kubernetes.io/projected/fb8cddf3-a4bf-4e9c-bce1-51b19e1fefee-kube-api-access-px66c\") pod \"collect-profiles-29405625-nkqmr\" (UID: \"fb8cddf3-a4bf-4e9c-bce1-51b19e1fefee\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29405625-nkqmr" Nov 28 13:45:00 crc kubenswrapper[4970]: I1128 13:45:00.433487 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/fb8cddf3-a4bf-4e9c-bce1-51b19e1fefee-secret-volume\") pod \"collect-profiles-29405625-nkqmr\" (UID: \"fb8cddf3-a4bf-4e9c-bce1-51b19e1fefee\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29405625-nkqmr" Nov 28 13:45:00 crc kubenswrapper[4970]: I1128 13:45:00.433562 4970 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/fb8cddf3-a4bf-4e9c-bce1-51b19e1fefee-config-volume\") pod \"collect-profiles-29405625-nkqmr\" (UID: \"fb8cddf3-a4bf-4e9c-bce1-51b19e1fefee\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29405625-nkqmr" Nov 28 13:45:00 crc kubenswrapper[4970]: I1128 13:45:00.435662 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/fb8cddf3-a4bf-4e9c-bce1-51b19e1fefee-config-volume\") pod \"collect-profiles-29405625-nkqmr\" (UID: \"fb8cddf3-a4bf-4e9c-bce1-51b19e1fefee\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29405625-nkqmr" Nov 28 13:45:00 crc kubenswrapper[4970]: I1128 13:45:00.443654 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/fb8cddf3-a4bf-4e9c-bce1-51b19e1fefee-secret-volume\") pod \"collect-profiles-29405625-nkqmr\" (UID: \"fb8cddf3-a4bf-4e9c-bce1-51b19e1fefee\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29405625-nkqmr" Nov 28 13:45:00 crc kubenswrapper[4970]: I1128 13:45:00.456351 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-px66c\" (UniqueName: \"kubernetes.io/projected/fb8cddf3-a4bf-4e9c-bce1-51b19e1fefee-kube-api-access-px66c\") pod \"collect-profiles-29405625-nkqmr\" (UID: \"fb8cddf3-a4bf-4e9c-bce1-51b19e1fefee\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29405625-nkqmr" Nov 28 13:45:00 crc kubenswrapper[4970]: I1128 13:45:00.476606 4970 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29405625-nkqmr" Nov 28 13:45:00 crc kubenswrapper[4970]: I1128 13:45:00.796920 4970 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-p8ckr" Nov 28 13:45:00 crc kubenswrapper[4970]: I1128 13:45:00.838884 4970 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-p8ckr"] Nov 28 13:45:00 crc kubenswrapper[4970]: I1128 13:45:00.896032 4970 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29405625-nkqmr"] Nov 28 13:45:01 crc kubenswrapper[4970]: I1128 13:45:01.754691 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29405625-nkqmr" event={"ID":"fb8cddf3-a4bf-4e9c-bce1-51b19e1fefee","Type":"ContainerStarted","Data":"6256dd75623e8191e3481ea1f4c72311a1eaf9059eccbb0bc7adc5394fd524d0"} Nov 28 13:45:01 crc kubenswrapper[4970]: I1128 13:45:01.755026 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29405625-nkqmr" event={"ID":"fb8cddf3-a4bf-4e9c-bce1-51b19e1fefee","Type":"ContainerStarted","Data":"a3eed645129d1562bee4033e27d7823738aa64f432e11397db87bf81fa6942ce"} Nov 28 13:45:01 crc kubenswrapper[4970]: I1128 13:45:01.776705 4970 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29405625-nkqmr" podStartSLOduration=1.776682465 podStartE2EDuration="1.776682465s" podCreationTimestamp="2025-11-28 13:45:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-28 13:45:01.774441633 +0000 UTC m=+1512.627323453" 
watchObservedRunningTime="2025-11-28 13:45:01.776682465 +0000 UTC m=+1512.629564275" Nov 28 13:45:02 crc kubenswrapper[4970]: I1128 13:45:02.773369 4970 generic.go:334] "Generic (PLEG): container finished" podID="fb8cddf3-a4bf-4e9c-bce1-51b19e1fefee" containerID="6256dd75623e8191e3481ea1f4c72311a1eaf9059eccbb0bc7adc5394fd524d0" exitCode=0 Nov 28 13:45:02 crc kubenswrapper[4970]: I1128 13:45:02.773477 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29405625-nkqmr" event={"ID":"fb8cddf3-a4bf-4e9c-bce1-51b19e1fefee","Type":"ContainerDied","Data":"6256dd75623e8191e3481ea1f4c72311a1eaf9059eccbb0bc7adc5394fd524d0"} Nov 28 13:45:02 crc kubenswrapper[4970]: I1128 13:45:02.773631 4970 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-p8ckr" podUID="7ad4940c-86a5-4aa0-b805-e8193b8b9ff6" containerName="registry-server" containerID="cri-o://4c70a80241d4272f500cdc2ea1e45975e356dc41c64db9ba439317dd12582722" gracePeriod=2 Nov 28 13:45:03 crc kubenswrapper[4970]: I1128 13:45:03.273864 4970 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-p8ckr" Nov 28 13:45:03 crc kubenswrapper[4970]: I1128 13:45:03.386878 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7ad4940c-86a5-4aa0-b805-e8193b8b9ff6-catalog-content\") pod \"7ad4940c-86a5-4aa0-b805-e8193b8b9ff6\" (UID: \"7ad4940c-86a5-4aa0-b805-e8193b8b9ff6\") " Nov 28 13:45:03 crc kubenswrapper[4970]: I1128 13:45:03.387272 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7ad4940c-86a5-4aa0-b805-e8193b8b9ff6-utilities\") pod \"7ad4940c-86a5-4aa0-b805-e8193b8b9ff6\" (UID: \"7ad4940c-86a5-4aa0-b805-e8193b8b9ff6\") " Nov 28 13:45:03 crc kubenswrapper[4970]: I1128 13:45:03.387363 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kbpv8\" (UniqueName: \"kubernetes.io/projected/7ad4940c-86a5-4aa0-b805-e8193b8b9ff6-kube-api-access-kbpv8\") pod \"7ad4940c-86a5-4aa0-b805-e8193b8b9ff6\" (UID: \"7ad4940c-86a5-4aa0-b805-e8193b8b9ff6\") " Nov 28 13:45:03 crc kubenswrapper[4970]: I1128 13:45:03.388803 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7ad4940c-86a5-4aa0-b805-e8193b8b9ff6-utilities" (OuterVolumeSpecName: "utilities") pod "7ad4940c-86a5-4aa0-b805-e8193b8b9ff6" (UID: "7ad4940c-86a5-4aa0-b805-e8193b8b9ff6"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 28 13:45:03 crc kubenswrapper[4970]: I1128 13:45:03.394075 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7ad4940c-86a5-4aa0-b805-e8193b8b9ff6-kube-api-access-kbpv8" (OuterVolumeSpecName: "kube-api-access-kbpv8") pod "7ad4940c-86a5-4aa0-b805-e8193b8b9ff6" (UID: "7ad4940c-86a5-4aa0-b805-e8193b8b9ff6"). InnerVolumeSpecName "kube-api-access-kbpv8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 28 13:45:03 crc kubenswrapper[4970]: I1128 13:45:03.411043 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7ad4940c-86a5-4aa0-b805-e8193b8b9ff6-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "7ad4940c-86a5-4aa0-b805-e8193b8b9ff6" (UID: "7ad4940c-86a5-4aa0-b805-e8193b8b9ff6"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 28 13:45:03 crc kubenswrapper[4970]: I1128 13:45:03.488954 4970 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7ad4940c-86a5-4aa0-b805-e8193b8b9ff6-utilities\") on node \"crc\" DevicePath \"\"" Nov 28 13:45:03 crc kubenswrapper[4970]: I1128 13:45:03.489015 4970 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kbpv8\" (UniqueName: \"kubernetes.io/projected/7ad4940c-86a5-4aa0-b805-e8193b8b9ff6-kube-api-access-kbpv8\") on node \"crc\" DevicePath \"\"" Nov 28 13:45:03 crc kubenswrapper[4970]: I1128 13:45:03.489030 4970 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7ad4940c-86a5-4aa0-b805-e8193b8b9ff6-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 28 13:45:03 crc kubenswrapper[4970]: I1128 13:45:03.784438 4970 generic.go:334] "Generic (PLEG): container finished" podID="7ad4940c-86a5-4aa0-b805-e8193b8b9ff6" containerID="4c70a80241d4272f500cdc2ea1e45975e356dc41c64db9ba439317dd12582722" exitCode=0 Nov 28 13:45:03 crc kubenswrapper[4970]: I1128 13:45:03.784541 4970 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-p8ckr" Nov 28 13:45:03 crc kubenswrapper[4970]: I1128 13:45:03.784643 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-p8ckr" event={"ID":"7ad4940c-86a5-4aa0-b805-e8193b8b9ff6","Type":"ContainerDied","Data":"4c70a80241d4272f500cdc2ea1e45975e356dc41c64db9ba439317dd12582722"} Nov 28 13:45:03 crc kubenswrapper[4970]: I1128 13:45:03.785000 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-p8ckr" event={"ID":"7ad4940c-86a5-4aa0-b805-e8193b8b9ff6","Type":"ContainerDied","Data":"7c239904717c23e8d53b2db9c06a41ba998412a4ec809a6618b9ca468cfbfc92"} Nov 28 13:45:03 crc kubenswrapper[4970]: I1128 13:45:03.785045 4970 scope.go:117] "RemoveContainer" containerID="4c70a80241d4272f500cdc2ea1e45975e356dc41c64db9ba439317dd12582722" Nov 28 13:45:03 crc kubenswrapper[4970]: I1128 13:45:03.814353 4970 scope.go:117] "RemoveContainer" containerID="9de7b437f12a7e92ce4877ed62ea1f2d2e30dbd23e458ae44e01b3adea3c34f1" Nov 28 13:45:03 crc kubenswrapper[4970]: I1128 13:45:03.845732 4970 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-p8ckr"] Nov 28 13:45:03 crc kubenswrapper[4970]: I1128 13:45:03.851370 4970 scope.go:117] "RemoveContainer" containerID="71480ca768ef049b29bbfe682f424ffeb210d351d1be20cf24e15058c17f462e" Nov 28 13:45:03 crc kubenswrapper[4970]: I1128 13:45:03.852866 4970 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-p8ckr"] Nov 28 13:45:03 crc kubenswrapper[4970]: I1128 13:45:03.875680 4970 scope.go:117] "RemoveContainer" containerID="4c70a80241d4272f500cdc2ea1e45975e356dc41c64db9ba439317dd12582722" Nov 28 13:45:03 crc kubenswrapper[4970]: E1128 13:45:03.876204 4970 log.go:32] 
"ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4c70a80241d4272f500cdc2ea1e45975e356dc41c64db9ba439317dd12582722\": container with ID starting with 4c70a80241d4272f500cdc2ea1e45975e356dc41c64db9ba439317dd12582722 not found: ID does not exist" containerID="4c70a80241d4272f500cdc2ea1e45975e356dc41c64db9ba439317dd12582722" Nov 28 13:45:03 crc kubenswrapper[4970]: I1128 13:45:03.876280 4970 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4c70a80241d4272f500cdc2ea1e45975e356dc41c64db9ba439317dd12582722"} err="failed to get container status \"4c70a80241d4272f500cdc2ea1e45975e356dc41c64db9ba439317dd12582722\": rpc error: code = NotFound desc = could not find container \"4c70a80241d4272f500cdc2ea1e45975e356dc41c64db9ba439317dd12582722\": container with ID starting with 4c70a80241d4272f500cdc2ea1e45975e356dc41c64db9ba439317dd12582722 not found: ID does not exist" Nov 28 13:45:03 crc kubenswrapper[4970]: I1128 13:45:03.876331 4970 scope.go:117] "RemoveContainer" containerID="9de7b437f12a7e92ce4877ed62ea1f2d2e30dbd23e458ae44e01b3adea3c34f1" Nov 28 13:45:03 crc kubenswrapper[4970]: E1128 13:45:03.876787 4970 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9de7b437f12a7e92ce4877ed62ea1f2d2e30dbd23e458ae44e01b3adea3c34f1\": container with ID starting with 9de7b437f12a7e92ce4877ed62ea1f2d2e30dbd23e458ae44e01b3adea3c34f1 not found: ID does not exist" containerID="9de7b437f12a7e92ce4877ed62ea1f2d2e30dbd23e458ae44e01b3adea3c34f1" Nov 28 13:45:03 crc kubenswrapper[4970]: I1128 13:45:03.876845 4970 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9de7b437f12a7e92ce4877ed62ea1f2d2e30dbd23e458ae44e01b3adea3c34f1"} err="failed to get container status \"9de7b437f12a7e92ce4877ed62ea1f2d2e30dbd23e458ae44e01b3adea3c34f1\": rpc error: code = NotFound desc = could not find container \"9de7b437f12a7e92ce4877ed62ea1f2d2e30dbd23e458ae44e01b3adea3c34f1\": container with ID starting with 9de7b437f12a7e92ce4877ed62ea1f2d2e30dbd23e458ae44e01b3adea3c34f1 not found: ID does not exist" Nov 28 13:45:03 crc kubenswrapper[4970]: I1128 13:45:03.876868 4970 scope.go:117] "RemoveContainer" containerID="71480ca768ef049b29bbfe682f424ffeb210d351d1be20cf24e15058c17f462e" Nov 28 13:45:03 crc kubenswrapper[4970]: E1128 13:45:03.877399 4970 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"71480ca768ef049b29bbfe682f424ffeb210d351d1be20cf24e15058c17f462e\": container with ID starting with 71480ca768ef049b29bbfe682f424ffeb210d351d1be20cf24e15058c17f462e not found: ID does not exist" containerID="71480ca768ef049b29bbfe682f424ffeb210d351d1be20cf24e15058c17f462e" Nov 28 13:45:03 crc kubenswrapper[4970]: I1128 13:45:03.877427 4970 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"71480ca768ef049b29bbfe682f424ffeb210d351d1be20cf24e15058c17f462e"} err="failed to get container status \"71480ca768ef049b29bbfe682f424ffeb210d351d1be20cf24e15058c17f462e\": rpc error: code = NotFound desc = could not find container \"71480ca768ef049b29bbfe682f424ffeb210d351d1be20cf24e15058c17f462e\": container with ID starting with 71480ca768ef049b29bbfe682f424ffeb210d351d1be20cf24e15058c17f462e not found: ID does not exist" Nov 28 13:45:04 crc kubenswrapper[4970]: I1128 13:45:04.126695 4970 util.go:48] "No ready sandbox for 
pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29405625-nkqmr" Nov 28 13:45:04 crc kubenswrapper[4970]: I1128 13:45:04.302001 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/fb8cddf3-a4bf-4e9c-bce1-51b19e1fefee-secret-volume\") pod \"fb8cddf3-a4bf-4e9c-bce1-51b19e1fefee\" (UID: \"fb8cddf3-a4bf-4e9c-bce1-51b19e1fefee\") " Nov 28 13:45:04 crc kubenswrapper[4970]: I1128 13:45:04.302072 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-px66c\" (UniqueName: \"kubernetes.io/projected/fb8cddf3-a4bf-4e9c-bce1-51b19e1fefee-kube-api-access-px66c\") pod \"fb8cddf3-a4bf-4e9c-bce1-51b19e1fefee\" (UID: \"fb8cddf3-a4bf-4e9c-bce1-51b19e1fefee\") " Nov 28 13:45:04 crc kubenswrapper[4970]: I1128 13:45:04.302110 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/fb8cddf3-a4bf-4e9c-bce1-51b19e1fefee-config-volume\") pod \"fb8cddf3-a4bf-4e9c-bce1-51b19e1fefee\" (UID: \"fb8cddf3-a4bf-4e9c-bce1-51b19e1fefee\") " Nov 28 13:45:04 crc kubenswrapper[4970]: I1128 13:45:04.304083 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fb8cddf3-a4bf-4e9c-bce1-51b19e1fefee-config-volume" (OuterVolumeSpecName: "config-volume") pod "fb8cddf3-a4bf-4e9c-bce1-51b19e1fefee" (UID: "fb8cddf3-a4bf-4e9c-bce1-51b19e1fefee"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 28 13:45:04 crc kubenswrapper[4970]: I1128 13:45:04.305101 4970 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/fb8cddf3-a4bf-4e9c-bce1-51b19e1fefee-config-volume\") on node \"crc\" DevicePath \"\"" Nov 28 13:45:04 crc kubenswrapper[4970]: I1128 13:45:04.309286 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fb8cddf3-a4bf-4e9c-bce1-51b19e1fefee-kube-api-access-px66c" (OuterVolumeSpecName: "kube-api-access-px66c") pod "fb8cddf3-a4bf-4e9c-bce1-51b19e1fefee" (UID: "fb8cddf3-a4bf-4e9c-bce1-51b19e1fefee"). InnerVolumeSpecName "kube-api-access-px66c". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 28 13:45:04 crc kubenswrapper[4970]: I1128 13:45:04.315277 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fb8cddf3-a4bf-4e9c-bce1-51b19e1fefee-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "fb8cddf3-a4bf-4e9c-bce1-51b19e1fefee" (UID: "fb8cddf3-a4bf-4e9c-bce1-51b19e1fefee"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 28 13:45:04 crc kubenswrapper[4970]: I1128 13:45:04.381618 4970 scope.go:117] "RemoveContainer" containerID="8085d03de453d8ba9c452eec6c018c90247e638e269f72dd81e94e709f69fd50" Nov 28 13:45:04 crc kubenswrapper[4970]: E1128 13:45:04.382002 4970 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tjrng_openshift-machine-config-operator(70bedd43-c527-436e-b47b-0b9ec5b10601)\"" pod="openshift-machine-config-operator/machine-config-daemon-tjrng" podUID="70bedd43-c527-436e-b47b-0b9ec5b10601" Nov 28 13:45:04 crc kubenswrapper[4970]: I1128 13:45:04.407908 4970 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/fb8cddf3-a4bf-4e9c-bce1-51b19e1fefee-secret-volume\") on node \"crc\" DevicePath \"\"" Nov 28 13:45:04 crc kubenswrapper[4970]: I1128 13:45:04.407965 4970 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-px66c\" (UniqueName: \"kubernetes.io/projected/fb8cddf3-a4bf-4e9c-bce1-51b19e1fefee-kube-api-access-px66c\") on node \"crc\" DevicePath \"\"" Nov 28 13:45:04 crc kubenswrapper[4970]: I1128 13:45:04.796848 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29405625-nkqmr" event={"ID":"fb8cddf3-a4bf-4e9c-bce1-51b19e1fefee","Type":"ContainerDied","Data":"a3eed645129d1562bee4033e27d7823738aa64f432e11397db87bf81fa6942ce"} Nov 28 13:45:04 crc kubenswrapper[4970]: I1128 13:45:04.796895 4970 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a3eed645129d1562bee4033e27d7823738aa64f432e11397db87bf81fa6942ce" Nov 28 13:45:04 crc kubenswrapper[4970]: I1128 13:45:04.796860 4970 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29405625-nkqmr" Nov 28 13:45:05 crc kubenswrapper[4970]: I1128 13:45:05.391828 4970 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7ad4940c-86a5-4aa0-b805-e8193b8b9ff6" path="/var/lib/kubelet/pods/7ad4940c-86a5-4aa0-b805-e8193b8b9ff6/volumes" Nov 28 13:45:15 crc kubenswrapper[4970]: I1128 13:45:15.381956 4970 scope.go:117] "RemoveContainer" containerID="8085d03de453d8ba9c452eec6c018c90247e638e269f72dd81e94e709f69fd50" Nov 28 13:45:15 crc kubenswrapper[4970]: E1128 13:45:15.383143 4970 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tjrng_openshift-machine-config-operator(70bedd43-c527-436e-b47b-0b9ec5b10601)\"" pod="openshift-machine-config-operator/machine-config-daemon-tjrng" podUID="70bedd43-c527-436e-b47b-0b9ec5b10601" Nov 28 13:45:29 crc kubenswrapper[4970]: I1128 13:45:29.384896 4970 scope.go:117] "RemoveContainer" containerID="8085d03de453d8ba9c452eec6c018c90247e638e269f72dd81e94e709f69fd50" Nov 28 13:45:29 crc kubenswrapper[4970]: E1128 13:45:29.385640 4970 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tjrng_openshift-machine-config-operator(70bedd43-c527-436e-b47b-0b9ec5b10601)\"" pod="openshift-machine-config-operator/machine-config-daemon-tjrng" podUID="70bedd43-c527-436e-b47b-0b9ec5b10601" Nov 28 13:45:44 crc kubenswrapper[4970]: I1128 13:45:44.381355 4970 scope.go:117] "RemoveContainer" containerID="8085d03de453d8ba9c452eec6c018c90247e638e269f72dd81e94e709f69fd50" Nov 28 13:45:44 crc kubenswrapper[4970]: E1128 13:45:44.382026 4970 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tjrng_openshift-machine-config-operator(70bedd43-c527-436e-b47b-0b9ec5b10601)\"" pod="openshift-machine-config-operator/machine-config-daemon-tjrng" podUID="70bedd43-c527-436e-b47b-0b9ec5b10601" Nov 28 13:45:51 crc kubenswrapper[4970]: I1128 13:45:51.061359 4970 scope.go:117] "RemoveContainer" containerID="877de299d69ff0ee836fcef3c0c71152a9538f3cfe202ca93873aab38f1c641f" Nov 28 13:45:51 crc kubenswrapper[4970]: I1128 13:45:51.112110 4970 scope.go:117] "RemoveContainer" containerID="93944f73dc0636948b106db60dba8d56aa4e910ad8e953783c5ea33264ecf117" Nov 28 13:45:51 crc kubenswrapper[4970]: I1128 13:45:51.148658 4970 scope.go:117] "RemoveContainer" containerID="d3be7ca8abac0727af7754c96865eff56935a731cae22281811af54886785ab2" Nov 28 13:45:51 crc kubenswrapper[4970]: I1128 13:45:51.196539 4970 scope.go:117] "RemoveContainer" containerID="fe75df7012727427cb85fb6507dde88d14cc7f28c697916077fd01fe888ccea5" Nov 28 13:45:51 crc kubenswrapper[4970]: I1128 13:45:51.211797 4970 scope.go:117] "RemoveContainer" containerID="8d5290d1b1410eb750de6e70caeb91fa0c9236d2a5f735a059577ae1ed60c3ef" Nov 28 13:45:57 crc kubenswrapper[4970]: I1128 13:45:57.381211 4970 scope.go:117] "RemoveContainer" containerID="8085d03de453d8ba9c452eec6c018c90247e638e269f72dd81e94e709f69fd50" Nov 28 13:45:57 crc kubenswrapper[4970]: E1128 13:45:57.382441 4970 pod_workers.go:1301] "Error syncing pod, 
skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tjrng_openshift-machine-config-operator(70bedd43-c527-436e-b47b-0b9ec5b10601)\"" pod="openshift-machine-config-operator/machine-config-daemon-tjrng" podUID="70bedd43-c527-436e-b47b-0b9ec5b10601" Nov 28 13:46:09 crc kubenswrapper[4970]: I1128 13:46:09.381883 4970 scope.go:117] "RemoveContainer" containerID="8085d03de453d8ba9c452eec6c018c90247e638e269f72dd81e94e709f69fd50" Nov 28 13:46:09 crc kubenswrapper[4970]: E1128 13:46:09.382986 4970 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tjrng_openshift-machine-config-operator(70bedd43-c527-436e-b47b-0b9ec5b10601)\"" pod="openshift-machine-config-operator/machine-config-daemon-tjrng" podUID="70bedd43-c527-436e-b47b-0b9ec5b10601" Nov 28 13:46:23 crc kubenswrapper[4970]: I1128 13:46:23.381530 4970 scope.go:117] "RemoveContainer" containerID="8085d03de453d8ba9c452eec6c018c90247e638e269f72dd81e94e709f69fd50" Nov 28 13:46:23 crc kubenswrapper[4970]: E1128 13:46:23.384138 4970 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tjrng_openshift-machine-config-operator(70bedd43-c527-436e-b47b-0b9ec5b10601)\"" pod="openshift-machine-config-operator/machine-config-daemon-tjrng" podUID="70bedd43-c527-436e-b47b-0b9ec5b10601" Nov 28 13:46:36 crc kubenswrapper[4970]: I1128 13:46:36.381008 4970 scope.go:117] "RemoveContainer" containerID="8085d03de453d8ba9c452eec6c018c90247e638e269f72dd81e94e709f69fd50" Nov 28 13:46:36 crc kubenswrapper[4970]: E1128 13:46:36.382781 4970 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tjrng_openshift-machine-config-operator(70bedd43-c527-436e-b47b-0b9ec5b10601)\"" pod="openshift-machine-config-operator/machine-config-daemon-tjrng" podUID="70bedd43-c527-436e-b47b-0b9ec5b10601" Nov 28 13:46:50 crc kubenswrapper[4970]: I1128 13:46:50.380992 4970 scope.go:117] "RemoveContainer" containerID="8085d03de453d8ba9c452eec6c018c90247e638e269f72dd81e94e709f69fd50" Nov 28 13:46:50 crc kubenswrapper[4970]: E1128 13:46:50.381694 4970 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tjrng_openshift-machine-config-operator(70bedd43-c527-436e-b47b-0b9ec5b10601)\"" pod="openshift-machine-config-operator/machine-config-daemon-tjrng" podUID="70bedd43-c527-436e-b47b-0b9ec5b10601" Nov 28 13:46:51 crc kubenswrapper[4970]: I1128 13:46:51.321449 4970 scope.go:117] "RemoveContainer" containerID="82715e6f6166f873f9da6a9ee4768eb74cec82c4e44e80311b9edafe2c0e8cdd" Nov 28 13:46:51 crc kubenswrapper[4970]: I1128 13:46:51.345883 4970 scope.go:117] "RemoveContainer" containerID="61f3b0e29d43cf715e6e77ba454db2738d3b86164f0d058812d4d752197ddb83" Nov 28 13:46:51 crc kubenswrapper[4970]: I1128 13:46:51.374499 4970 scope.go:117] "RemoveContainer" 
containerID="20ebd83a1a3d7e6f7a0f39226452867a290148296adaef862f741b6ea6619d01" Nov 28 13:46:51 crc kubenswrapper[4970]: I1128 13:46:51.415366 4970 scope.go:117] "RemoveContainer" containerID="1f3aa2553a84343080af5009b2640fd359598a4ef84e55cfabfa5180558a8ff7" Nov 28 13:46:51 crc kubenswrapper[4970]: I1128 13:46:51.449908 4970 scope.go:117] "RemoveContainer" containerID="dca9d8e843ae6879f29de64545c0c1a8b168e16f9250bf61751bc68178232cab" Nov 28 13:47:01 crc kubenswrapper[4970]: I1128 13:47:01.381310 4970 scope.go:117] "RemoveContainer" containerID="8085d03de453d8ba9c452eec6c018c90247e638e269f72dd81e94e709f69fd50" Nov 28 13:47:01 crc kubenswrapper[4970]: E1128 13:47:01.382162 4970 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tjrng_openshift-machine-config-operator(70bedd43-c527-436e-b47b-0b9ec5b10601)\"" pod="openshift-machine-config-operator/machine-config-daemon-tjrng" podUID="70bedd43-c527-436e-b47b-0b9ec5b10601" Nov 28 13:47:14 crc kubenswrapper[4970]: I1128 13:47:14.381352 4970 scope.go:117] "RemoveContainer" containerID="8085d03de453d8ba9c452eec6c018c90247e638e269f72dd81e94e709f69fd50" Nov 28 13:47:14 crc kubenswrapper[4970]: E1128 13:47:14.382523 4970 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tjrng_openshift-machine-config-operator(70bedd43-c527-436e-b47b-0b9ec5b10601)\"" pod="openshift-machine-config-operator/machine-config-daemon-tjrng" podUID="70bedd43-c527-436e-b47b-0b9ec5b10601" Nov 28 13:47:28 crc kubenswrapper[4970]: I1128 13:47:28.381525 4970 scope.go:117] "RemoveContainer" containerID="8085d03de453d8ba9c452eec6c018c90247e638e269f72dd81e94e709f69fd50" Nov 28 13:47:28 crc kubenswrapper[4970]: E1128 13:47:28.382661 4970 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tjrng_openshift-machine-config-operator(70bedd43-c527-436e-b47b-0b9ec5b10601)\"" pod="openshift-machine-config-operator/machine-config-daemon-tjrng" podUID="70bedd43-c527-436e-b47b-0b9ec5b10601" Nov 28 13:47:39 crc kubenswrapper[4970]: I1128 13:47:39.384926 4970 scope.go:117] "RemoveContainer" containerID="8085d03de453d8ba9c452eec6c018c90247e638e269f72dd81e94e709f69fd50" Nov 28 13:47:39 crc kubenswrapper[4970]: E1128 13:47:39.385634 4970 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tjrng_openshift-machine-config-operator(70bedd43-c527-436e-b47b-0b9ec5b10601)\"" pod="openshift-machine-config-operator/machine-config-daemon-tjrng" podUID="70bedd43-c527-436e-b47b-0b9ec5b10601" Nov 28 13:47:51 crc kubenswrapper[4970]: I1128 13:47:51.556689 4970 scope.go:117] "RemoveContainer" containerID="ce6c74640208dae6d3e76db85346f9d929887c92a252876e5f8c0da4a7f8d373" Nov 28 13:47:53 crc kubenswrapper[4970]: I1128 13:47:53.381510 4970 scope.go:117] "RemoveContainer" containerID="8085d03de453d8ba9c452eec6c018c90247e638e269f72dd81e94e709f69fd50" Nov 28 13:47:53 crc kubenswrapper[4970]: E1128 13:47:53.382362 4970 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tjrng_openshift-machine-config-operator(70bedd43-c527-436e-b47b-0b9ec5b10601)\"" pod="openshift-machine-config-operator/machine-config-daemon-tjrng" podUID="70bedd43-c527-436e-b47b-0b9ec5b10601" Nov 28 13:48:04 crc kubenswrapper[4970]: I1128 13:48:04.380867 4970 scope.go:117] "RemoveContainer" containerID="8085d03de453d8ba9c452eec6c018c90247e638e269f72dd81e94e709f69fd50" Nov 28 13:48:04 crc kubenswrapper[4970]: E1128 13:48:04.382984 4970 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tjrng_openshift-machine-config-operator(70bedd43-c527-436e-b47b-0b9ec5b10601)\"" pod="openshift-machine-config-operator/machine-config-daemon-tjrng" podUID="70bedd43-c527-436e-b47b-0b9ec5b10601" Nov 28 13:48:17 crc kubenswrapper[4970]: I1128 13:48:17.381829 4970 scope.go:117] "RemoveContainer" containerID="8085d03de453d8ba9c452eec6c018c90247e638e269f72dd81e94e709f69fd50" Nov 28 13:48:17 crc kubenswrapper[4970]: E1128 13:48:17.382617 4970 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tjrng_openshift-machine-config-operator(70bedd43-c527-436e-b47b-0b9ec5b10601)\"" pod="openshift-machine-config-operator/machine-config-daemon-tjrng" podUID="70bedd43-c527-436e-b47b-0b9ec5b10601" Nov 28 13:48:31 crc kubenswrapper[4970]: I1128 13:48:31.381475 4970 scope.go:117] "RemoveContainer" containerID="8085d03de453d8ba9c452eec6c018c90247e638e269f72dd81e94e709f69fd50" Nov 28 13:48:31 crc kubenswrapper[4970]: E1128 13:48:31.382605 4970 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tjrng_openshift-machine-config-operator(70bedd43-c527-436e-b47b-0b9ec5b10601)\"" pod="openshift-machine-config-operator/machine-config-daemon-tjrng" podUID="70bedd43-c527-436e-b47b-0b9ec5b10601" Nov 28 13:48:44 crc kubenswrapper[4970]: I1128 13:48:44.381089 4970 scope.go:117] "RemoveContainer" containerID="8085d03de453d8ba9c452eec6c018c90247e638e269f72dd81e94e709f69fd50" Nov 28 13:48:44 crc kubenswrapper[4970]: E1128 13:48:44.381883 4970 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tjrng_openshift-machine-config-operator(70bedd43-c527-436e-b47b-0b9ec5b10601)\"" pod="openshift-machine-config-operator/machine-config-daemon-tjrng" podUID="70bedd43-c527-436e-b47b-0b9ec5b10601" Nov 28 13:48:59 crc kubenswrapper[4970]: I1128 13:48:59.388282 4970 scope.go:117] "RemoveContainer" containerID="8085d03de453d8ba9c452eec6c018c90247e638e269f72dd81e94e709f69fd50" Nov 28 13:48:59 crc kubenswrapper[4970]: E1128 13:48:59.389326 4970 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-tjrng_openshift-machine-config-operator(70bedd43-c527-436e-b47b-0b9ec5b10601)\"" pod="openshift-machine-config-operator/machine-config-daemon-tjrng" podUID="70bedd43-c527-436e-b47b-0b9ec5b10601" Nov 28 13:49:12 crc kubenswrapper[4970]: I1128 13:49:12.381999 4970 scope.go:117] "RemoveContainer" containerID="8085d03de453d8ba9c452eec6c018c90247e638e269f72dd81e94e709f69fd50" Nov 28 13:49:12 crc kubenswrapper[4970]: E1128 13:49:12.383162 4970 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tjrng_openshift-machine-config-operator(70bedd43-c527-436e-b47b-0b9ec5b10601)\"" pod="openshift-machine-config-operator/machine-config-daemon-tjrng" podUID="70bedd43-c527-436e-b47b-0b9ec5b10601" Nov 28 13:49:23 crc kubenswrapper[4970]: I1128 13:49:23.380852 4970 scope.go:117] "RemoveContainer" containerID="8085d03de453d8ba9c452eec6c018c90247e638e269f72dd81e94e709f69fd50" Nov 28 13:49:23 crc kubenswrapper[4970]: E1128 13:49:23.382131 4970 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tjrng_openshift-machine-config-operator(70bedd43-c527-436e-b47b-0b9ec5b10601)\"" pod="openshift-machine-config-operator/machine-config-daemon-tjrng" podUID="70bedd43-c527-436e-b47b-0b9ec5b10601" Nov 28 13:49:37 crc kubenswrapper[4970]: I1128 13:49:37.381569 4970 scope.go:117] "RemoveContainer" containerID="8085d03de453d8ba9c452eec6c018c90247e638e269f72dd81e94e709f69fd50" Nov 28 13:49:37 crc kubenswrapper[4970]: E1128 13:49:37.382380 4970 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tjrng_openshift-machine-config-operator(70bedd43-c527-436e-b47b-0b9ec5b10601)\"" pod="openshift-machine-config-operator/machine-config-daemon-tjrng" podUID="70bedd43-c527-436e-b47b-0b9ec5b10601" Nov 28 13:49:51 crc kubenswrapper[4970]: I1128 13:49:51.381254 4970 scope.go:117] "RemoveContainer" containerID="8085d03de453d8ba9c452eec6c018c90247e638e269f72dd81e94e709f69fd50" Nov 28 13:49:51 crc kubenswrapper[4970]: E1128 13:49:51.381875 4970 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tjrng_openshift-machine-config-operator(70bedd43-c527-436e-b47b-0b9ec5b10601)\"" pod="openshift-machine-config-operator/machine-config-daemon-tjrng" podUID="70bedd43-c527-436e-b47b-0b9ec5b10601" Nov 28 13:50:03 crc kubenswrapper[4970]: I1128 13:50:03.380798 4970 scope.go:117] "RemoveContainer" containerID="8085d03de453d8ba9c452eec6c018c90247e638e269f72dd81e94e709f69fd50" Nov 28 13:50:04 crc kubenswrapper[4970]: I1128 13:50:04.432752 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-tjrng" event={"ID":"70bedd43-c527-436e-b47b-0b9ec5b10601","Type":"ContainerStarted","Data":"989ffc3243eb185213e945a6f9a9c46c00572aa829c5947a5b48998743fc78c3"} Nov 28 13:51:17 crc kubenswrapper[4970]: I1128 13:51:17.052742 4970 kubelet.go:2437] "SyncLoop 
DELETE" source="api" pods=["keystone-kuttl-tests/keystone-db-create-8m9wz"] Nov 28 13:51:17 crc kubenswrapper[4970]: I1128 13:51:17.062919 4970 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["keystone-kuttl-tests/keystone-db-create-8m9wz"] Nov 28 13:51:17 crc kubenswrapper[4970]: I1128 13:51:17.072511 4970 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["keystone-kuttl-tests/keystone-85fc-account-create-update-dth5q"] Nov 28 13:51:17 crc kubenswrapper[4970]: I1128 13:51:17.078556 4970 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["keystone-kuttl-tests/keystone-85fc-account-create-update-dth5q"] Nov 28 13:51:17 crc kubenswrapper[4970]: I1128 13:51:17.389409 4970 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0015a1f-b332-4d23-9c85-83fce1551460" path="/var/lib/kubelet/pods/a0015a1f-b332-4d23-9c85-83fce1551460/volumes" Nov 28 13:51:17 crc kubenswrapper[4970]: I1128 13:51:17.390492 4970 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d97c0e2d-5899-4b03-94ef-d0ade8964b2d" path="/var/lib/kubelet/pods/d97c0e2d-5899-4b03-94ef-d0ade8964b2d/volumes" Nov 28 13:51:25 crc kubenswrapper[4970]: I1128 13:51:25.022463 4970 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["keystone-kuttl-tests/keystone-db-sync-b9wh9"] Nov 28 13:51:25 crc kubenswrapper[4970]: I1128 13:51:25.028946 4970 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["keystone-kuttl-tests/keystone-db-sync-b9wh9"] Nov 28 13:51:25 crc kubenswrapper[4970]: I1128 13:51:25.388807 4970 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c24bb341-be92-44c8-bcea-d15bcb539a26" path="/var/lib/kubelet/pods/c24bb341-be92-44c8-bcea-d15bcb539a26/volumes" Nov 28 13:51:30 crc kubenswrapper[4970]: I1128 13:51:30.035389 4970 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["keystone-kuttl-tests/keystone-bootstrap-kc54s"] Nov 28 13:51:30 crc kubenswrapper[4970]: I1128 13:51:30.043376 4970 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["keystone-kuttl-tests/keystone-bootstrap-kc54s"] Nov 28 13:51:31 crc kubenswrapper[4970]: I1128 13:51:31.390574 4970 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d22c4593-6019-4a0c-9ca3-2da7907946be" path="/var/lib/kubelet/pods/d22c4593-6019-4a0c-9ca3-2da7907946be/volumes" Nov 28 13:51:51 crc kubenswrapper[4970]: I1128 13:51:51.675999 4970 scope.go:117] "RemoveContainer" containerID="f7241326c4b9209487d1844c103cedd43712580a7e4ee2a4627478f5efd866ed" Nov 28 13:51:51 crc kubenswrapper[4970]: I1128 13:51:51.701261 4970 scope.go:117] "RemoveContainer" containerID="1c27b6736a74509268beb39adc22f4d98ca67cf21927baba449a6aa64b66a4ca" Nov 28 13:51:51 crc kubenswrapper[4970]: I1128 13:51:51.728979 4970 scope.go:117] "RemoveContainer" containerID="b4a1d256477224d248ffc936d1c5d9aca1aaae06c142ef33adec4c8996aa215b" Nov 28 13:51:51 crc kubenswrapper[4970]: I1128 13:51:51.786185 4970 scope.go:117] "RemoveContainer" containerID="d995d3c0dd322a8417043e4c613395fe5c7c2e6637d3313257ef13a9ec9b05a2" Nov 28 13:52:19 crc kubenswrapper[4970]: I1128 13:52:19.075822 4970 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["keystone-kuttl-tests/openstackclient"] Nov 28 13:52:19 crc kubenswrapper[4970]: I1128 13:52:19.076697 4970 kuberuntime_container.go:808] "Killing container with a grace period" pod="keystone-kuttl-tests/openstackclient" podUID="4712809e-99af-47ba-84d0-085f3b07f326" containerName="openstackclient" containerID="cri-o://7096b7c56697c221da2868892e97f77650355123a3a886f93cfdc446b2c286b3" gracePeriod=30 Nov 28 
13:52:19 crc kubenswrapper[4970]: I1128 13:52:19.569521 4970 generic.go:334] "Generic (PLEG): container finished" podID="4712809e-99af-47ba-84d0-085f3b07f326" containerID="7096b7c56697c221da2868892e97f77650355123a3a886f93cfdc446b2c286b3" exitCode=143 Nov 28 13:52:19 crc kubenswrapper[4970]: I1128 13:52:19.569673 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/openstackclient" event={"ID":"4712809e-99af-47ba-84d0-085f3b07f326","Type":"ContainerDied","Data":"7096b7c56697c221da2868892e97f77650355123a3a886f93cfdc446b2c286b3"} Nov 28 13:52:19 crc kubenswrapper[4970]: I1128 13:52:19.832351 4970 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/openstackclient" Nov 28 13:52:19 crc kubenswrapper[4970]: I1128 13:52:19.958527 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/4712809e-99af-47ba-84d0-085f3b07f326-openstack-config\") pod \"4712809e-99af-47ba-84d0-085f3b07f326\" (UID: \"4712809e-99af-47ba-84d0-085f3b07f326\") " Nov 28 13:52:19 crc kubenswrapper[4970]: I1128 13:52:19.958715 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c6htt\" (UniqueName: \"kubernetes.io/projected/4712809e-99af-47ba-84d0-085f3b07f326-kube-api-access-c6htt\") pod \"4712809e-99af-47ba-84d0-085f3b07f326\" (UID: \"4712809e-99af-47ba-84d0-085f3b07f326\") " Nov 28 13:52:19 crc kubenswrapper[4970]: I1128 13:52:19.958806 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/4712809e-99af-47ba-84d0-085f3b07f326-openstack-config-secret\") pod \"4712809e-99af-47ba-84d0-085f3b07f326\" (UID: \"4712809e-99af-47ba-84d0-085f3b07f326\") " Nov 28 13:52:19 crc kubenswrapper[4970]: I1128 13:52:19.966672 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4712809e-99af-47ba-84d0-085f3b07f326-kube-api-access-c6htt" (OuterVolumeSpecName: "kube-api-access-c6htt") pod "4712809e-99af-47ba-84d0-085f3b07f326" (UID: "4712809e-99af-47ba-84d0-085f3b07f326"). InnerVolumeSpecName "kube-api-access-c6htt". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 28 13:52:19 crc kubenswrapper[4970]: I1128 13:52:19.986824 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4712809e-99af-47ba-84d0-085f3b07f326-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "4712809e-99af-47ba-84d0-085f3b07f326" (UID: "4712809e-99af-47ba-84d0-085f3b07f326"). InnerVolumeSpecName "openstack-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 28 13:52:19 crc kubenswrapper[4970]: I1128 13:52:19.996801 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4712809e-99af-47ba-84d0-085f3b07f326-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "4712809e-99af-47ba-84d0-085f3b07f326" (UID: "4712809e-99af-47ba-84d0-085f3b07f326"). InnerVolumeSpecName "openstack-config-secret". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 28 13:52:20 crc kubenswrapper[4970]: I1128 13:52:20.060584 4970 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c6htt\" (UniqueName: \"kubernetes.io/projected/4712809e-99af-47ba-84d0-085f3b07f326-kube-api-access-c6htt\") on node \"crc\" DevicePath \"\"" Nov 28 13:52:20 crc kubenswrapper[4970]: I1128 13:52:20.060645 4970 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/4712809e-99af-47ba-84d0-085f3b07f326-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Nov 28 13:52:20 crc kubenswrapper[4970]: I1128 13:52:20.060664 4970 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/4712809e-99af-47ba-84d0-085f3b07f326-openstack-config\") on node \"crc\" DevicePath \"\"" Nov 28 13:52:20 crc kubenswrapper[4970]: I1128 13:52:20.579759 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/openstackclient" event={"ID":"4712809e-99af-47ba-84d0-085f3b07f326","Type":"ContainerDied","Data":"3067979218a3b1a4aa085d5831a366af93218516ffa586b7c8339599a1b99c19"} Nov 28 13:52:20 crc kubenswrapper[4970]: I1128 13:52:20.579835 4970 scope.go:117] "RemoveContainer" containerID="7096b7c56697c221da2868892e97f77650355123a3a886f93cfdc446b2c286b3" Nov 28 13:52:20 crc kubenswrapper[4970]: I1128 13:52:20.579835 4970 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/openstackclient" Nov 28 13:52:20 crc kubenswrapper[4970]: I1128 13:52:20.615393 4970 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["keystone-kuttl-tests/openstackclient"] Nov 28 13:52:20 crc kubenswrapper[4970]: I1128 13:52:20.619874 4970 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["keystone-kuttl-tests/openstackclient"] Nov 28 13:52:20 crc kubenswrapper[4970]: I1128 13:52:20.828989 4970 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["keystone-kuttl-tests/keystone-7f84979c99-7g2v7"] Nov 28 13:52:20 crc kubenswrapper[4970]: I1128 13:52:20.829255 4970 kuberuntime_container.go:808] "Killing container with a grace period" pod="keystone-kuttl-tests/keystone-7f84979c99-7g2v7" podUID="9232407d-077b-4ed6-9350-ee386d73677d" containerName="keystone-api" containerID="cri-o://bd25509b7b4977c0e5153720c068be9c128e3c06683781a962c067b244b1143d" gracePeriod=30 Nov 28 13:52:20 crc kubenswrapper[4970]: I1128 13:52:20.924511 4970 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["keystone-kuttl-tests/keystone85fc-account-delete-8nhcl"] Nov 28 13:52:20 crc kubenswrapper[4970]: E1128 13:52:20.924778 4970 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7ad4940c-86a5-4aa0-b805-e8193b8b9ff6" containerName="extract-utilities" Nov 28 13:52:20 crc kubenswrapper[4970]: I1128 13:52:20.924801 4970 state_mem.go:107] "Deleted CPUSet assignment" podUID="7ad4940c-86a5-4aa0-b805-e8193b8b9ff6" containerName="extract-utilities" Nov 28 13:52:20 crc kubenswrapper[4970]: E1128 13:52:20.924829 4970 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7ad4940c-86a5-4aa0-b805-e8193b8b9ff6" containerName="extract-content" Nov 28 13:52:20 crc kubenswrapper[4970]: I1128 13:52:20.924837 4970 state_mem.go:107] "Deleted CPUSet assignment" podUID="7ad4940c-86a5-4aa0-b805-e8193b8b9ff6" containerName="extract-content" Nov 28 13:52:20 crc kubenswrapper[4970]: E1128 13:52:20.924845 4970 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="7ad4940c-86a5-4aa0-b805-e8193b8b9ff6" containerName="registry-server" Nov 28 13:52:20 crc kubenswrapper[4970]: I1128 13:52:20.924852 4970 state_mem.go:107] "Deleted CPUSet assignment" podUID="7ad4940c-86a5-4aa0-b805-e8193b8b9ff6" containerName="registry-server" Nov 28 13:52:20 crc kubenswrapper[4970]: E1128 13:52:20.924866 4970 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fb8cddf3-a4bf-4e9c-bce1-51b19e1fefee" containerName="collect-profiles" Nov 28 13:52:20 crc kubenswrapper[4970]: I1128 13:52:20.924874 4970 state_mem.go:107] "Deleted CPUSet assignment" podUID="fb8cddf3-a4bf-4e9c-bce1-51b19e1fefee" containerName="collect-profiles" Nov 28 13:52:20 crc kubenswrapper[4970]: E1128 13:52:20.924884 4970 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4712809e-99af-47ba-84d0-085f3b07f326" containerName="openstackclient" Nov 28 13:52:20 crc kubenswrapper[4970]: I1128 13:52:20.924891 4970 state_mem.go:107] "Deleted CPUSet assignment" podUID="4712809e-99af-47ba-84d0-085f3b07f326" containerName="openstackclient" Nov 28 13:52:20 crc kubenswrapper[4970]: I1128 13:52:20.925013 4970 memory_manager.go:354] "RemoveStaleState removing state" podUID="fb8cddf3-a4bf-4e9c-bce1-51b19e1fefee" containerName="collect-profiles" Nov 28 13:52:20 crc kubenswrapper[4970]: I1128 13:52:20.925026 4970 memory_manager.go:354] "RemoveStaleState removing state" podUID="7ad4940c-86a5-4aa0-b805-e8193b8b9ff6" containerName="registry-server" Nov 28 13:52:20 crc kubenswrapper[4970]: I1128 13:52:20.925043 4970 memory_manager.go:354] "RemoveStaleState removing state" podUID="4712809e-99af-47ba-84d0-085f3b07f326" containerName="openstackclient" Nov 28 13:52:20 crc kubenswrapper[4970]: I1128 13:52:20.925583 4970 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="keystone-kuttl-tests/keystone85fc-account-delete-8nhcl" Nov 28 13:52:20 crc kubenswrapper[4970]: I1128 13:52:20.939739 4970 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/keystone85fc-account-delete-8nhcl"] Nov 28 13:52:21 crc kubenswrapper[4970]: I1128 13:52:21.125705 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wjx88\" (UniqueName: \"kubernetes.io/projected/2207ac08-790e-4eff-83cb-82a7b52344ff-kube-api-access-wjx88\") pod \"keystone85fc-account-delete-8nhcl\" (UID: \"2207ac08-790e-4eff-83cb-82a7b52344ff\") " pod="keystone-kuttl-tests/keystone85fc-account-delete-8nhcl" Nov 28 13:52:21 crc kubenswrapper[4970]: I1128 13:52:21.125770 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2207ac08-790e-4eff-83cb-82a7b52344ff-operator-scripts\") pod \"keystone85fc-account-delete-8nhcl\" (UID: \"2207ac08-790e-4eff-83cb-82a7b52344ff\") " pod="keystone-kuttl-tests/keystone85fc-account-delete-8nhcl" Nov 28 13:52:21 crc kubenswrapper[4970]: I1128 13:52:21.227527 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wjx88\" (UniqueName: \"kubernetes.io/projected/2207ac08-790e-4eff-83cb-82a7b52344ff-kube-api-access-wjx88\") pod \"keystone85fc-account-delete-8nhcl\" (UID: \"2207ac08-790e-4eff-83cb-82a7b52344ff\") " pod="keystone-kuttl-tests/keystone85fc-account-delete-8nhcl" Nov 28 13:52:21 crc kubenswrapper[4970]: I1128 13:52:21.227617 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2207ac08-790e-4eff-83cb-82a7b52344ff-operator-scripts\") pod \"keystone85fc-account-delete-8nhcl\" (UID: \"2207ac08-790e-4eff-83cb-82a7b52344ff\") " pod="keystone-kuttl-tests/keystone85fc-account-delete-8nhcl" Nov 28 13:52:21 crc kubenswrapper[4970]: I1128 13:52:21.228699 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2207ac08-790e-4eff-83cb-82a7b52344ff-operator-scripts\") pod \"keystone85fc-account-delete-8nhcl\" (UID: \"2207ac08-790e-4eff-83cb-82a7b52344ff\") " pod="keystone-kuttl-tests/keystone85fc-account-delete-8nhcl" Nov 28 13:52:21 crc kubenswrapper[4970]: I1128 13:52:21.253028 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wjx88\" (UniqueName: \"kubernetes.io/projected/2207ac08-790e-4eff-83cb-82a7b52344ff-kube-api-access-wjx88\") pod \"keystone85fc-account-delete-8nhcl\" (UID: \"2207ac08-790e-4eff-83cb-82a7b52344ff\") " pod="keystone-kuttl-tests/keystone85fc-account-delete-8nhcl" Nov 28 13:52:21 crc kubenswrapper[4970]: I1128 13:52:21.333774 4970 patch_prober.go:28] interesting pod/machine-config-daemon-tjrng container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 28 13:52:21 crc kubenswrapper[4970]: I1128 13:52:21.334069 4970 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-tjrng" podUID="70bedd43-c527-436e-b47b-0b9ec5b10601" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 
28 13:52:21 crc kubenswrapper[4970]: I1128 13:52:21.393741 4970 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4712809e-99af-47ba-84d0-085f3b07f326" path="/var/lib/kubelet/pods/4712809e-99af-47ba-84d0-085f3b07f326/volumes" Nov 28 13:52:21 crc kubenswrapper[4970]: I1128 13:52:21.538721 4970 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone85fc-account-delete-8nhcl" Nov 28 13:52:21 crc kubenswrapper[4970]: I1128 13:52:21.987979 4970 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/keystone85fc-account-delete-8nhcl"] Nov 28 13:52:21 crc kubenswrapper[4970]: W1128 13:52:21.998725 4970 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2207ac08_790e_4eff_83cb_82a7b52344ff.slice/crio-e6d3042c13281adb0e3519621c6241aa3c7c52c2fab358c0a7f21048172b2651 WatchSource:0}: Error finding container e6d3042c13281adb0e3519621c6241aa3c7c52c2fab358c0a7f21048172b2651: Status 404 returned error can't find the container with id e6d3042c13281adb0e3519621c6241aa3c7c52c2fab358c0a7f21048172b2651 Nov 28 13:52:22 crc kubenswrapper[4970]: I1128 13:52:22.595730 4970 generic.go:334] "Generic (PLEG): container finished" podID="2207ac08-790e-4eff-83cb-82a7b52344ff" containerID="d6a44d57dac70abf7616d423ca23ce27df639c9eb6e1b84f50747db23b275a6d" exitCode=0 Nov 28 13:52:22 crc kubenswrapper[4970]: I1128 13:52:22.595861 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone85fc-account-delete-8nhcl" event={"ID":"2207ac08-790e-4eff-83cb-82a7b52344ff","Type":"ContainerDied","Data":"d6a44d57dac70abf7616d423ca23ce27df639c9eb6e1b84f50747db23b275a6d"} Nov 28 13:52:22 crc kubenswrapper[4970]: I1128 13:52:22.596050 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone85fc-account-delete-8nhcl" event={"ID":"2207ac08-790e-4eff-83cb-82a7b52344ff","Type":"ContainerStarted","Data":"e6d3042c13281adb0e3519621c6241aa3c7c52c2fab358c0a7f21048172b2651"} Nov 28 13:52:23 crc kubenswrapper[4970]: I1128 13:52:23.838861 4970 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone85fc-account-delete-8nhcl" Nov 28 13:52:23 crc kubenswrapper[4970]: I1128 13:52:23.963358 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2207ac08-790e-4eff-83cb-82a7b52344ff-operator-scripts\") pod \"2207ac08-790e-4eff-83cb-82a7b52344ff\" (UID: \"2207ac08-790e-4eff-83cb-82a7b52344ff\") " Nov 28 13:52:23 crc kubenswrapper[4970]: I1128 13:52:23.963414 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wjx88\" (UniqueName: \"kubernetes.io/projected/2207ac08-790e-4eff-83cb-82a7b52344ff-kube-api-access-wjx88\") pod \"2207ac08-790e-4eff-83cb-82a7b52344ff\" (UID: \"2207ac08-790e-4eff-83cb-82a7b52344ff\") " Nov 28 13:52:23 crc kubenswrapper[4970]: I1128 13:52:23.964552 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2207ac08-790e-4eff-83cb-82a7b52344ff-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "2207ac08-790e-4eff-83cb-82a7b52344ff" (UID: "2207ac08-790e-4eff-83cb-82a7b52344ff"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 28 13:52:23 crc kubenswrapper[4970]: I1128 13:52:23.970129 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2207ac08-790e-4eff-83cb-82a7b52344ff-kube-api-access-wjx88" (OuterVolumeSpecName: "kube-api-access-wjx88") pod "2207ac08-790e-4eff-83cb-82a7b52344ff" (UID: "2207ac08-790e-4eff-83cb-82a7b52344ff"). InnerVolumeSpecName "kube-api-access-wjx88". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 28 13:52:24 crc kubenswrapper[4970]: I1128 13:52:24.064768 4970 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2207ac08-790e-4eff-83cb-82a7b52344ff-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 28 13:52:24 crc kubenswrapper[4970]: I1128 13:52:24.064809 4970 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wjx88\" (UniqueName: \"kubernetes.io/projected/2207ac08-790e-4eff-83cb-82a7b52344ff-kube-api-access-wjx88\") on node \"crc\" DevicePath \"\"" Nov 28 13:52:24 crc kubenswrapper[4970]: I1128 13:52:24.348476 4970 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone-7f84979c99-7g2v7" Nov 28 13:52:24 crc kubenswrapper[4970]: I1128 13:52:24.470018 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/9232407d-077b-4ed6-9350-ee386d73677d-fernet-keys\") pod \"9232407d-077b-4ed6-9350-ee386d73677d\" (UID: \"9232407d-077b-4ed6-9350-ee386d73677d\") " Nov 28 13:52:24 crc kubenswrapper[4970]: I1128 13:52:24.470109 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9232407d-077b-4ed6-9350-ee386d73677d-scripts\") pod \"9232407d-077b-4ed6-9350-ee386d73677d\" (UID: \"9232407d-077b-4ed6-9350-ee386d73677d\") " Nov 28 13:52:24 crc kubenswrapper[4970]: I1128 13:52:24.470194 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9232407d-077b-4ed6-9350-ee386d73677d-config-data\") pod \"9232407d-077b-4ed6-9350-ee386d73677d\" (UID: \"9232407d-077b-4ed6-9350-ee386d73677d\") " Nov 28 13:52:24 crc kubenswrapper[4970]: I1128 13:52:24.470291 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/9232407d-077b-4ed6-9350-ee386d73677d-credential-keys\") pod \"9232407d-077b-4ed6-9350-ee386d73677d\" (UID: \"9232407d-077b-4ed6-9350-ee386d73677d\") " Nov 28 13:52:24 crc kubenswrapper[4970]: I1128 13:52:24.470335 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6lcvp\" (UniqueName: \"kubernetes.io/projected/9232407d-077b-4ed6-9350-ee386d73677d-kube-api-access-6lcvp\") pod \"9232407d-077b-4ed6-9350-ee386d73677d\" (UID: \"9232407d-077b-4ed6-9350-ee386d73677d\") " Nov 28 13:52:24 crc kubenswrapper[4970]: I1128 13:52:24.474851 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9232407d-077b-4ed6-9350-ee386d73677d-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "9232407d-077b-4ed6-9350-ee386d73677d" (UID: "9232407d-077b-4ed6-9350-ee386d73677d"). InnerVolumeSpecName "credential-keys". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 28 13:52:24 crc kubenswrapper[4970]: I1128 13:52:24.475156 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9232407d-077b-4ed6-9350-ee386d73677d-kube-api-access-6lcvp" (OuterVolumeSpecName: "kube-api-access-6lcvp") pod "9232407d-077b-4ed6-9350-ee386d73677d" (UID: "9232407d-077b-4ed6-9350-ee386d73677d"). InnerVolumeSpecName "kube-api-access-6lcvp". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 28 13:52:24 crc kubenswrapper[4970]: I1128 13:52:24.476287 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9232407d-077b-4ed6-9350-ee386d73677d-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "9232407d-077b-4ed6-9350-ee386d73677d" (UID: "9232407d-077b-4ed6-9350-ee386d73677d"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 28 13:52:24 crc kubenswrapper[4970]: I1128 13:52:24.476921 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9232407d-077b-4ed6-9350-ee386d73677d-scripts" (OuterVolumeSpecName: "scripts") pod "9232407d-077b-4ed6-9350-ee386d73677d" (UID: "9232407d-077b-4ed6-9350-ee386d73677d"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 28 13:52:24 crc kubenswrapper[4970]: I1128 13:52:24.487118 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9232407d-077b-4ed6-9350-ee386d73677d-config-data" (OuterVolumeSpecName: "config-data") pod "9232407d-077b-4ed6-9350-ee386d73677d" (UID: "9232407d-077b-4ed6-9350-ee386d73677d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 28 13:52:24 crc kubenswrapper[4970]: I1128 13:52:24.571769 4970 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/9232407d-077b-4ed6-9350-ee386d73677d-credential-keys\") on node \"crc\" DevicePath \"\"" Nov 28 13:52:24 crc kubenswrapper[4970]: I1128 13:52:24.571814 4970 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6lcvp\" (UniqueName: \"kubernetes.io/projected/9232407d-077b-4ed6-9350-ee386d73677d-kube-api-access-6lcvp\") on node \"crc\" DevicePath \"\"" Nov 28 13:52:24 crc kubenswrapper[4970]: I1128 13:52:24.571826 4970 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/9232407d-077b-4ed6-9350-ee386d73677d-fernet-keys\") on node \"crc\" DevicePath \"\"" Nov 28 13:52:24 crc kubenswrapper[4970]: I1128 13:52:24.571834 4970 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9232407d-077b-4ed6-9350-ee386d73677d-scripts\") on node \"crc\" DevicePath \"\"" Nov 28 13:52:24 crc kubenswrapper[4970]: I1128 13:52:24.571844 4970 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9232407d-077b-4ed6-9350-ee386d73677d-config-data\") on node \"crc\" DevicePath \"\"" Nov 28 13:52:24 crc kubenswrapper[4970]: I1128 13:52:24.611347 4970 generic.go:334] "Generic (PLEG): container finished" podID="9232407d-077b-4ed6-9350-ee386d73677d" containerID="bd25509b7b4977c0e5153720c068be9c128e3c06683781a962c067b244b1143d" exitCode=0 Nov 28 13:52:24 crc kubenswrapper[4970]: I1128 13:52:24.611382 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-7f84979c99-7g2v7" 
event={"ID":"9232407d-077b-4ed6-9350-ee386d73677d","Type":"ContainerDied","Data":"bd25509b7b4977c0e5153720c068be9c128e3c06683781a962c067b244b1143d"} Nov 28 13:52:24 crc kubenswrapper[4970]: I1128 13:52:24.611534 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-7f84979c99-7g2v7" event={"ID":"9232407d-077b-4ed6-9350-ee386d73677d","Type":"ContainerDied","Data":"06b2d2fafa3a2f6b6af8874eafc901c3d588e1bb3ea4adf8c68026dbc8fdf693"} Nov 28 13:52:24 crc kubenswrapper[4970]: I1128 13:52:24.611434 4970 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone-7f84979c99-7g2v7" Nov 28 13:52:24 crc kubenswrapper[4970]: I1128 13:52:24.611609 4970 scope.go:117] "RemoveContainer" containerID="bd25509b7b4977c0e5153720c068be9c128e3c06683781a962c067b244b1143d" Nov 28 13:52:24 crc kubenswrapper[4970]: I1128 13:52:24.616751 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone85fc-account-delete-8nhcl" event={"ID":"2207ac08-790e-4eff-83cb-82a7b52344ff","Type":"ContainerDied","Data":"e6d3042c13281adb0e3519621c6241aa3c7c52c2fab358c0a7f21048172b2651"} Nov 28 13:52:24 crc kubenswrapper[4970]: I1128 13:52:24.616784 4970 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone85fc-account-delete-8nhcl" Nov 28 13:52:24 crc kubenswrapper[4970]: I1128 13:52:24.616793 4970 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e6d3042c13281adb0e3519621c6241aa3c7c52c2fab358c0a7f21048172b2651" Nov 28 13:52:24 crc kubenswrapper[4970]: I1128 13:52:24.634940 4970 scope.go:117] "RemoveContainer" containerID="bd25509b7b4977c0e5153720c068be9c128e3c06683781a962c067b244b1143d" Nov 28 13:52:24 crc kubenswrapper[4970]: E1128 13:52:24.635584 4970 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bd25509b7b4977c0e5153720c068be9c128e3c06683781a962c067b244b1143d\": container with ID starting with bd25509b7b4977c0e5153720c068be9c128e3c06683781a962c067b244b1143d not found: ID does not exist" containerID="bd25509b7b4977c0e5153720c068be9c128e3c06683781a962c067b244b1143d" Nov 28 13:52:24 crc kubenswrapper[4970]: I1128 13:52:24.635622 4970 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bd25509b7b4977c0e5153720c068be9c128e3c06683781a962c067b244b1143d"} err="failed to get container status \"bd25509b7b4977c0e5153720c068be9c128e3c06683781a962c067b244b1143d\": rpc error: code = NotFound desc = could not find container \"bd25509b7b4977c0e5153720c068be9c128e3c06683781a962c067b244b1143d\": container with ID starting with bd25509b7b4977c0e5153720c068be9c128e3c06683781a962c067b244b1143d not found: ID does not exist" Nov 28 13:52:24 crc kubenswrapper[4970]: I1128 13:52:24.648770 4970 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["keystone-kuttl-tests/keystone-7f84979c99-7g2v7"] Nov 28 13:52:24 crc kubenswrapper[4970]: I1128 13:52:24.653174 4970 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["keystone-kuttl-tests/keystone-7f84979c99-7g2v7"] Nov 28 13:52:25 crc kubenswrapper[4970]: I1128 13:52:25.393426 4970 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9232407d-077b-4ed6-9350-ee386d73677d" path="/var/lib/kubelet/pods/9232407d-077b-4ed6-9350-ee386d73677d/volumes" Nov 28 13:52:25 crc kubenswrapper[4970]: I1128 13:52:25.950991 4970 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["keystone-kuttl-tests/keystone85fc-account-delete-8nhcl"] Nov 28 13:52:25 crc kubenswrapper[4970]: I1128 13:52:25.956368 4970 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["keystone-kuttl-tests/keystone85fc-account-delete-8nhcl"] Nov 28 13:52:27 crc kubenswrapper[4970]: I1128 13:52:27.388484 4970 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2207ac08-790e-4eff-83cb-82a7b52344ff" path="/var/lib/kubelet/pods/2207ac08-790e-4eff-83cb-82a7b52344ff/volumes" Nov 28 13:52:40 crc kubenswrapper[4970]: I1128 13:52:40.274700 4970 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["keystone-kuttl-tests/openstack-galera-1"] Nov 28 13:52:40 crc kubenswrapper[4970]: I1128 13:52:40.281499 4970 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["keystone-kuttl-tests/openstack-galera-0"] Nov 28 13:52:40 crc kubenswrapper[4970]: I1128 13:52:40.286149 4970 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["keystone-kuttl-tests/openstack-galera-2"] Nov 28 13:52:40 crc kubenswrapper[4970]: I1128 13:52:40.437481 4970 kuberuntime_container.go:808] "Killing container with a grace period" pod="keystone-kuttl-tests/openstack-galera-2" podUID="70137649-04fe-46dd-94ef-03a6ab19aecd" containerName="galera" containerID="cri-o://b48fb2d166b766f82ae81402f8f0b0f54e5d99a322c3e2b505036c8baaeeffde" gracePeriod=30 Nov 28 13:52:41 crc kubenswrapper[4970]: I1128 13:52:41.462355 4970 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["keystone-kuttl-tests/memcached-0"] Nov 28 13:52:41 crc kubenswrapper[4970]: I1128 13:52:41.462965 4970 kuberuntime_container.go:808] "Killing container with a grace period" pod="keystone-kuttl-tests/memcached-0" podUID="b351352a-c436-4df2-9d43-f7dde4bb6a8a" containerName="memcached" containerID="cri-o://d91e788ea7590bdafe486d27d197219035b40a1b36c00192819e88eb4b37295a" gracePeriod=30 Nov 28 13:52:42 crc kubenswrapper[4970]: I1128 13:52:42.125919 4970 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/rabbitmq-server-0"] Nov 28 13:52:42 crc kubenswrapper[4970]: I1128 13:52:42.343176 4970 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="keystone-kuttl-tests/openstack-galera-2" Nov 28 13:52:42 crc kubenswrapper[4970]: I1128 13:52:42.437947 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/70137649-04fe-46dd-94ef-03a6ab19aecd-kolla-config\") pod \"70137649-04fe-46dd-94ef-03a6ab19aecd\" (UID: \"70137649-04fe-46dd-94ef-03a6ab19aecd\") " Nov 28 13:52:42 crc kubenswrapper[4970]: I1128 13:52:42.437994 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/70137649-04fe-46dd-94ef-03a6ab19aecd-config-data-default\") pod \"70137649-04fe-46dd-94ef-03a6ab19aecd\" (UID: \"70137649-04fe-46dd-94ef-03a6ab19aecd\") " Nov 28 13:52:42 crc kubenswrapper[4970]: I1128 13:52:42.438029 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fg244\" (UniqueName: \"kubernetes.io/projected/70137649-04fe-46dd-94ef-03a6ab19aecd-kube-api-access-fg244\") pod \"70137649-04fe-46dd-94ef-03a6ab19aecd\" (UID: \"70137649-04fe-46dd-94ef-03a6ab19aecd\") " Nov 28 13:52:42 crc kubenswrapper[4970]: I1128 13:52:42.438050 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mysql-db\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"70137649-04fe-46dd-94ef-03a6ab19aecd\" (UID: \"70137649-04fe-46dd-94ef-03a6ab19aecd\") " Nov 28 13:52:42 crc kubenswrapper[4970]: I1128 13:52:42.438092 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/70137649-04fe-46dd-94ef-03a6ab19aecd-config-data-generated\") pod \"70137649-04fe-46dd-94ef-03a6ab19aecd\" (UID: \"70137649-04fe-46dd-94ef-03a6ab19aecd\") " Nov 28 13:52:42 crc kubenswrapper[4970]: I1128 13:52:42.438141 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/70137649-04fe-46dd-94ef-03a6ab19aecd-operator-scripts\") pod \"70137649-04fe-46dd-94ef-03a6ab19aecd\" (UID: \"70137649-04fe-46dd-94ef-03a6ab19aecd\") " Nov 28 13:52:42 crc kubenswrapper[4970]: I1128 13:52:42.438500 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/70137649-04fe-46dd-94ef-03a6ab19aecd-kolla-config" (OuterVolumeSpecName: "kolla-config") pod "70137649-04fe-46dd-94ef-03a6ab19aecd" (UID: "70137649-04fe-46dd-94ef-03a6ab19aecd"). InnerVolumeSpecName "kolla-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 28 13:52:42 crc kubenswrapper[4970]: I1128 13:52:42.438565 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/70137649-04fe-46dd-94ef-03a6ab19aecd-config-data-default" (OuterVolumeSpecName: "config-data-default") pod "70137649-04fe-46dd-94ef-03a6ab19aecd" (UID: "70137649-04fe-46dd-94ef-03a6ab19aecd"). InnerVolumeSpecName "config-data-default". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 28 13:52:42 crc kubenswrapper[4970]: I1128 13:52:42.439210 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/70137649-04fe-46dd-94ef-03a6ab19aecd-config-data-generated" (OuterVolumeSpecName: "config-data-generated") pod "70137649-04fe-46dd-94ef-03a6ab19aecd" (UID: "70137649-04fe-46dd-94ef-03a6ab19aecd"). InnerVolumeSpecName "config-data-generated". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 28 13:52:42 crc kubenswrapper[4970]: I1128 13:52:42.439522 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/70137649-04fe-46dd-94ef-03a6ab19aecd-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "70137649-04fe-46dd-94ef-03a6ab19aecd" (UID: "70137649-04fe-46dd-94ef-03a6ab19aecd"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 28 13:52:42 crc kubenswrapper[4970]: I1128 13:52:42.445208 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/70137649-04fe-46dd-94ef-03a6ab19aecd-kube-api-access-fg244" (OuterVolumeSpecName: "kube-api-access-fg244") pod "70137649-04fe-46dd-94ef-03a6ab19aecd" (UID: "70137649-04fe-46dd-94ef-03a6ab19aecd"). InnerVolumeSpecName "kube-api-access-fg244". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 28 13:52:42 crc kubenswrapper[4970]: I1128 13:52:42.448876 4970 kuberuntime_container.go:808] "Killing container with a grace period" pod="keystone-kuttl-tests/openstack-galera-1" podUID="3a4491a2-79c8-4e5b-8f2f-6c8182f09885" containerName="galera" containerID="cri-o://22e6da56be8ccc2e55abefbafa87c681b32b59e74b0592edffe81f01886f9d79" gracePeriod=28 Nov 28 13:52:42 crc kubenswrapper[4970]: I1128 13:52:42.453426 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage08-crc" (OuterVolumeSpecName: "mysql-db") pod "70137649-04fe-46dd-94ef-03a6ab19aecd" (UID: "70137649-04fe-46dd-94ef-03a6ab19aecd"). InnerVolumeSpecName "local-storage08-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Nov 28 13:52:42 crc kubenswrapper[4970]: I1128 13:52:42.492325 4970 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["keystone-kuttl-tests/rabbitmq-server-0"] Nov 28 13:52:42 crc kubenswrapper[4970]: I1128 13:52:42.539274 4970 reconciler_common.go:293] "Volume detached for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/70137649-04fe-46dd-94ef-03a6ab19aecd-kolla-config\") on node \"crc\" DevicePath \"\"" Nov 28 13:52:42 crc kubenswrapper[4970]: I1128 13:52:42.539310 4970 reconciler_common.go:293] "Volume detached for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/70137649-04fe-46dd-94ef-03a6ab19aecd-config-data-default\") on node \"crc\" DevicePath \"\"" Nov 28 13:52:42 crc kubenswrapper[4970]: I1128 13:52:42.539321 4970 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fg244\" (UniqueName: \"kubernetes.io/projected/70137649-04fe-46dd-94ef-03a6ab19aecd-kube-api-access-fg244\") on node \"crc\" DevicePath \"\"" Nov 28 13:52:42 crc kubenswrapper[4970]: I1128 13:52:42.539352 4970 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") on node \"crc\" " Nov 28 13:52:42 crc kubenswrapper[4970]: I1128 13:52:42.539363 4970 reconciler_common.go:293] "Volume detached for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/70137649-04fe-46dd-94ef-03a6ab19aecd-config-data-generated\") on node \"crc\" DevicePath \"\"" Nov 28 13:52:42 crc kubenswrapper[4970]: I1128 13:52:42.539372 4970 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/70137649-04fe-46dd-94ef-03a6ab19aecd-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 28 13:52:42 crc 
kubenswrapper[4970]: I1128 13:52:42.549139 4970 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage08-crc" (UniqueName: "kubernetes.io/local-volume/local-storage08-crc") on node "crc" Nov 28 13:52:42 crc kubenswrapper[4970]: I1128 13:52:42.640487 4970 reconciler_common.go:293] "Volume detached for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") on node \"crc\" DevicePath \"\"" Nov 28 13:52:42 crc kubenswrapper[4970]: I1128 13:52:42.712508 4970 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/memcached-0" Nov 28 13:52:42 crc kubenswrapper[4970]: I1128 13:52:42.741873 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ds9v5\" (UniqueName: \"kubernetes.io/projected/b351352a-c436-4df2-9d43-f7dde4bb6a8a-kube-api-access-ds9v5\") pod \"b351352a-c436-4df2-9d43-f7dde4bb6a8a\" (UID: \"b351352a-c436-4df2-9d43-f7dde4bb6a8a\") " Nov 28 13:52:42 crc kubenswrapper[4970]: I1128 13:52:42.742012 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b351352a-c436-4df2-9d43-f7dde4bb6a8a-config-data\") pod \"b351352a-c436-4df2-9d43-f7dde4bb6a8a\" (UID: \"b351352a-c436-4df2-9d43-f7dde4bb6a8a\") " Nov 28 13:52:42 crc kubenswrapper[4970]: I1128 13:52:42.742051 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/b351352a-c436-4df2-9d43-f7dde4bb6a8a-kolla-config\") pod \"b351352a-c436-4df2-9d43-f7dde4bb6a8a\" (UID: \"b351352a-c436-4df2-9d43-f7dde4bb6a8a\") " Nov 28 13:52:42 crc kubenswrapper[4970]: I1128 13:52:42.742653 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b351352a-c436-4df2-9d43-f7dde4bb6a8a-kolla-config" (OuterVolumeSpecName: "kolla-config") pod "b351352a-c436-4df2-9d43-f7dde4bb6a8a" (UID: "b351352a-c436-4df2-9d43-f7dde4bb6a8a"). InnerVolumeSpecName "kolla-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 28 13:52:42 crc kubenswrapper[4970]: I1128 13:52:42.742990 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b351352a-c436-4df2-9d43-f7dde4bb6a8a-config-data" (OuterVolumeSpecName: "config-data") pod "b351352a-c436-4df2-9d43-f7dde4bb6a8a" (UID: "b351352a-c436-4df2-9d43-f7dde4bb6a8a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 28 13:52:42 crc kubenswrapper[4970]: I1128 13:52:42.785173 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b351352a-c436-4df2-9d43-f7dde4bb6a8a-kube-api-access-ds9v5" (OuterVolumeSpecName: "kube-api-access-ds9v5") pod "b351352a-c436-4df2-9d43-f7dde4bb6a8a" (UID: "b351352a-c436-4df2-9d43-f7dde4bb6a8a"). InnerVolumeSpecName "kube-api-access-ds9v5". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 28 13:52:42 crc kubenswrapper[4970]: I1128 13:52:42.791178 4970 generic.go:334] "Generic (PLEG): container finished" podID="b351352a-c436-4df2-9d43-f7dde4bb6a8a" containerID="d91e788ea7590bdafe486d27d197219035b40a1b36c00192819e88eb4b37295a" exitCode=0 Nov 28 13:52:42 crc kubenswrapper[4970]: I1128 13:52:42.791230 4970 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="keystone-kuttl-tests/memcached-0" Nov 28 13:52:42 crc kubenswrapper[4970]: I1128 13:52:42.791278 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/memcached-0" event={"ID":"b351352a-c436-4df2-9d43-f7dde4bb6a8a","Type":"ContainerDied","Data":"d91e788ea7590bdafe486d27d197219035b40a1b36c00192819e88eb4b37295a"} Nov 28 13:52:42 crc kubenswrapper[4970]: I1128 13:52:42.791309 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/memcached-0" event={"ID":"b351352a-c436-4df2-9d43-f7dde4bb6a8a","Type":"ContainerDied","Data":"b2e066cac3acf317439141be0b6c32251c9409762ccb261a91bc49eafd7b8ae5"} Nov 28 13:52:42 crc kubenswrapper[4970]: I1128 13:52:42.791328 4970 scope.go:117] "RemoveContainer" containerID="d91e788ea7590bdafe486d27d197219035b40a1b36c00192819e88eb4b37295a" Nov 28 13:52:42 crc kubenswrapper[4970]: I1128 13:52:42.794467 4970 generic.go:334] "Generic (PLEG): container finished" podID="70137649-04fe-46dd-94ef-03a6ab19aecd" containerID="b48fb2d166b766f82ae81402f8f0b0f54e5d99a322c3e2b505036c8baaeeffde" exitCode=0 Nov 28 13:52:42 crc kubenswrapper[4970]: I1128 13:52:42.794749 4970 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/openstack-galera-2" Nov 28 13:52:42 crc kubenswrapper[4970]: I1128 13:52:42.795421 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/openstack-galera-2" event={"ID":"70137649-04fe-46dd-94ef-03a6ab19aecd","Type":"ContainerDied","Data":"b48fb2d166b766f82ae81402f8f0b0f54e5d99a322c3e2b505036c8baaeeffde"} Nov 28 13:52:42 crc kubenswrapper[4970]: I1128 13:52:42.795453 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/openstack-galera-2" event={"ID":"70137649-04fe-46dd-94ef-03a6ab19aecd","Type":"ContainerDied","Data":"c3384bb2c9f8f9bc54e7a5e36e7262765934d55b67e7f922ed1d5801b4726eed"} Nov 28 13:52:42 crc kubenswrapper[4970]: I1128 13:52:42.834643 4970 scope.go:117] "RemoveContainer" containerID="d91e788ea7590bdafe486d27d197219035b40a1b36c00192819e88eb4b37295a" Nov 28 13:52:42 crc kubenswrapper[4970]: E1128 13:52:42.838367 4970 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d91e788ea7590bdafe486d27d197219035b40a1b36c00192819e88eb4b37295a\": container with ID starting with d91e788ea7590bdafe486d27d197219035b40a1b36c00192819e88eb4b37295a not found: ID does not exist" containerID="d91e788ea7590bdafe486d27d197219035b40a1b36c00192819e88eb4b37295a" Nov 28 13:52:42 crc kubenswrapper[4970]: I1128 13:52:42.838412 4970 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d91e788ea7590bdafe486d27d197219035b40a1b36c00192819e88eb4b37295a"} err="failed to get container status \"d91e788ea7590bdafe486d27d197219035b40a1b36c00192819e88eb4b37295a\": rpc error: code = NotFound desc = could not find container \"d91e788ea7590bdafe486d27d197219035b40a1b36c00192819e88eb4b37295a\": container with ID starting with d91e788ea7590bdafe486d27d197219035b40a1b36c00192819e88eb4b37295a not found: ID does not exist" Nov 28 13:52:42 crc kubenswrapper[4970]: I1128 13:52:42.838436 4970 scope.go:117] "RemoveContainer" containerID="b48fb2d166b766f82ae81402f8f0b0f54e5d99a322c3e2b505036c8baaeeffde" Nov 28 13:52:42 crc kubenswrapper[4970]: I1128 13:52:42.843005 4970 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ds9v5\" (UniqueName: 
\"kubernetes.io/projected/b351352a-c436-4df2-9d43-f7dde4bb6a8a-kube-api-access-ds9v5\") on node \"crc\" DevicePath \"\"" Nov 28 13:52:42 crc kubenswrapper[4970]: I1128 13:52:42.843043 4970 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b351352a-c436-4df2-9d43-f7dde4bb6a8a-config-data\") on node \"crc\" DevicePath \"\"" Nov 28 13:52:42 crc kubenswrapper[4970]: I1128 13:52:42.843053 4970 reconciler_common.go:293] "Volume detached for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/b351352a-c436-4df2-9d43-f7dde4bb6a8a-kolla-config\") on node \"crc\" DevicePath \"\"" Nov 28 13:52:42 crc kubenswrapper[4970]: I1128 13:52:42.861958 4970 kuberuntime_container.go:808] "Killing container with a grace period" pod="keystone-kuttl-tests/rabbitmq-server-0" podUID="dfa2f2ae-c626-4fd8-a04c-a762e271a467" containerName="rabbitmq" containerID="cri-o://d34959bfbdbb8c13b83ef8e984b0cf24e5f22ab99a9594c089aeea056517a236" gracePeriod=604800 Nov 28 13:52:42 crc kubenswrapper[4970]: I1128 13:52:42.871885 4970 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["keystone-kuttl-tests/memcached-0"] Nov 28 13:52:42 crc kubenswrapper[4970]: I1128 13:52:42.879198 4970 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["keystone-kuttl-tests/memcached-0"] Nov 28 13:52:42 crc kubenswrapper[4970]: I1128 13:52:42.913408 4970 scope.go:117] "RemoveContainer" containerID="1071d951f3093c2095f7c2818f8fa07d75896e34c9ccc20ee0dd72e86a64d0a9" Nov 28 13:52:42 crc kubenswrapper[4970]: I1128 13:52:42.921390 4970 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["keystone-kuttl-tests/openstack-galera-2"] Nov 28 13:52:42 crc kubenswrapper[4970]: I1128 13:52:42.933739 4970 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["keystone-kuttl-tests/openstack-galera-2"] Nov 28 13:52:42 crc kubenswrapper[4970]: I1128 13:52:42.948065 4970 scope.go:117] "RemoveContainer" containerID="b48fb2d166b766f82ae81402f8f0b0f54e5d99a322c3e2b505036c8baaeeffde" Nov 28 13:52:42 crc kubenswrapper[4970]: E1128 13:52:42.948730 4970 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b48fb2d166b766f82ae81402f8f0b0f54e5d99a322c3e2b505036c8baaeeffde\": container with ID starting with b48fb2d166b766f82ae81402f8f0b0f54e5d99a322c3e2b505036c8baaeeffde not found: ID does not exist" containerID="b48fb2d166b766f82ae81402f8f0b0f54e5d99a322c3e2b505036c8baaeeffde" Nov 28 13:52:42 crc kubenswrapper[4970]: I1128 13:52:42.948766 4970 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b48fb2d166b766f82ae81402f8f0b0f54e5d99a322c3e2b505036c8baaeeffde"} err="failed to get container status \"b48fb2d166b766f82ae81402f8f0b0f54e5d99a322c3e2b505036c8baaeeffde\": rpc error: code = NotFound desc = could not find container \"b48fb2d166b766f82ae81402f8f0b0f54e5d99a322c3e2b505036c8baaeeffde\": container with ID starting with b48fb2d166b766f82ae81402f8f0b0f54e5d99a322c3e2b505036c8baaeeffde not found: ID does not exist" Nov 28 13:52:42 crc kubenswrapper[4970]: I1128 13:52:42.948787 4970 scope.go:117] "RemoveContainer" containerID="1071d951f3093c2095f7c2818f8fa07d75896e34c9ccc20ee0dd72e86a64d0a9" Nov 28 13:52:42 crc kubenswrapper[4970]: E1128 13:52:42.949113 4970 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1071d951f3093c2095f7c2818f8fa07d75896e34c9ccc20ee0dd72e86a64d0a9\": container with ID starting with 
1071d951f3093c2095f7c2818f8fa07d75896e34c9ccc20ee0dd72e86a64d0a9 not found: ID does not exist" containerID="1071d951f3093c2095f7c2818f8fa07d75896e34c9ccc20ee0dd72e86a64d0a9" Nov 28 13:52:42 crc kubenswrapper[4970]: I1128 13:52:42.949129 4970 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1071d951f3093c2095f7c2818f8fa07d75896e34c9ccc20ee0dd72e86a64d0a9"} err="failed to get container status \"1071d951f3093c2095f7c2818f8fa07d75896e34c9ccc20ee0dd72e86a64d0a9\": rpc error: code = NotFound desc = could not find container \"1071d951f3093c2095f7c2818f8fa07d75896e34c9ccc20ee0dd72e86a64d0a9\": container with ID starting with 1071d951f3093c2095f7c2818f8fa07d75896e34c9ccc20ee0dd72e86a64d0a9 not found: ID does not exist" Nov 28 13:52:43 crc kubenswrapper[4970]: I1128 13:52:43.387674 4970 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="70137649-04fe-46dd-94ef-03a6ab19aecd" path="/var/lib/kubelet/pods/70137649-04fe-46dd-94ef-03a6ab19aecd/volumes" Nov 28 13:52:43 crc kubenswrapper[4970]: I1128 13:52:43.388293 4970 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b351352a-c436-4df2-9d43-f7dde4bb6a8a" path="/var/lib/kubelet/pods/b351352a-c436-4df2-9d43-f7dde4bb6a8a/volumes" Nov 28 13:52:43 crc kubenswrapper[4970]: I1128 13:52:43.411690 4970 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-7456869864-hwf9l"] Nov 28 13:52:43 crc kubenswrapper[4970]: I1128 13:52:43.411910 4970 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/keystone-operator-controller-manager-7456869864-hwf9l" podUID="8bcf5bd0-9824-4c40-b009-8f2e50ad08b0" containerName="manager" containerID="cri-o://3ec6975cc83d8f66892a684775bf8fc026283124c64a2fef05829db449116928" gracePeriod=10 Nov 28 13:52:43 crc kubenswrapper[4970]: I1128 13:52:43.668342 4970 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/keystone-operator-index-d6xcn"] Nov 28 13:52:43 crc kubenswrapper[4970]: I1128 13:52:43.668893 4970 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/keystone-operator-index-d6xcn" podUID="a65f142b-ad4e-4901-9ca1-8d27e66fc59c" containerName="registry-server" containerID="cri-o://3033b684932cb40bcd30965974670650b1d5091eea5dfee3c188fa9c5e33baf2" gracePeriod=30 Nov 28 13:52:43 crc kubenswrapper[4970]: I1128 13:52:43.684061 4970 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/0f0d088e5b30b817f0b379848316eaf0cb8438002c589713c8ae2315c35bggs"] Nov 28 13:52:43 crc kubenswrapper[4970]: I1128 13:52:43.698253 4970 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/0f0d088e5b30b817f0b379848316eaf0cb8438002c589713c8ae2315c35bggs"] Nov 28 13:52:43 crc kubenswrapper[4970]: I1128 13:52:43.806171 4970 generic.go:334] "Generic (PLEG): container finished" podID="8bcf5bd0-9824-4c40-b009-8f2e50ad08b0" containerID="3ec6975cc83d8f66892a684775bf8fc026283124c64a2fef05829db449116928" exitCode=0 Nov 28 13:52:43 crc kubenswrapper[4970]: I1128 13:52:43.806264 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-7456869864-hwf9l" event={"ID":"8bcf5bd0-9824-4c40-b009-8f2e50ad08b0","Type":"ContainerDied","Data":"3ec6975cc83d8f66892a684775bf8fc026283124c64a2fef05829db449116928"} Nov 28 13:52:44 crc kubenswrapper[4970]: I1128 13:52:44.418737 4970 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-7456869864-hwf9l" Nov 28 13:52:44 crc kubenswrapper[4970]: I1128 13:52:44.447649 4970 prober.go:107] "Probe failed" probeType="Readiness" pod="keystone-kuttl-tests/rabbitmq-server-0" podUID="dfa2f2ae-c626-4fd8-a04c-a762e271a467" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.72:5672: connect: connection refused" Nov 28 13:52:44 crc kubenswrapper[4970]: I1128 13:52:44.503838 4970 kuberuntime_container.go:808] "Killing container with a grace period" pod="keystone-kuttl-tests/openstack-galera-0" podUID="1474c5bc-29c4-4da3-b2e9-900196941f19" containerName="galera" containerID="cri-o://0817c9c6a5a83728037407a7288119e8e5aa21266704002fa91de53268569359" gracePeriod=26 Nov 28 13:52:44 crc kubenswrapper[4970]: I1128 13:52:44.569057 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/8bcf5bd0-9824-4c40-b009-8f2e50ad08b0-apiservice-cert\") pod \"8bcf5bd0-9824-4c40-b009-8f2e50ad08b0\" (UID: \"8bcf5bd0-9824-4c40-b009-8f2e50ad08b0\") " Nov 28 13:52:44 crc kubenswrapper[4970]: I1128 13:52:44.569157 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/8bcf5bd0-9824-4c40-b009-8f2e50ad08b0-webhook-cert\") pod \"8bcf5bd0-9824-4c40-b009-8f2e50ad08b0\" (UID: \"8bcf5bd0-9824-4c40-b009-8f2e50ad08b0\") " Nov 28 13:52:44 crc kubenswrapper[4970]: I1128 13:52:44.569185 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b7b7k\" (UniqueName: \"kubernetes.io/projected/8bcf5bd0-9824-4c40-b009-8f2e50ad08b0-kube-api-access-b7b7k\") pod \"8bcf5bd0-9824-4c40-b009-8f2e50ad08b0\" (UID: \"8bcf5bd0-9824-4c40-b009-8f2e50ad08b0\") " Nov 28 13:52:44 crc kubenswrapper[4970]: I1128 13:52:44.573912 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8bcf5bd0-9824-4c40-b009-8f2e50ad08b0-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "8bcf5bd0-9824-4c40-b009-8f2e50ad08b0" (UID: "8bcf5bd0-9824-4c40-b009-8f2e50ad08b0"). InnerVolumeSpecName "apiservice-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 28 13:52:44 crc kubenswrapper[4970]: I1128 13:52:44.574325 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8bcf5bd0-9824-4c40-b009-8f2e50ad08b0-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "8bcf5bd0-9824-4c40-b009-8f2e50ad08b0" (UID: "8bcf5bd0-9824-4c40-b009-8f2e50ad08b0"). InnerVolumeSpecName "webhook-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 28 13:52:44 crc kubenswrapper[4970]: I1128 13:52:44.589483 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8bcf5bd0-9824-4c40-b009-8f2e50ad08b0-kube-api-access-b7b7k" (OuterVolumeSpecName: "kube-api-access-b7b7k") pod "8bcf5bd0-9824-4c40-b009-8f2e50ad08b0" (UID: "8bcf5bd0-9824-4c40-b009-8f2e50ad08b0"). InnerVolumeSpecName "kube-api-access-b7b7k". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 28 13:52:44 crc kubenswrapper[4970]: I1128 13:52:44.603323 4970 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/keystone-operator-index-d6xcn" Nov 28 13:52:44 crc kubenswrapper[4970]: I1128 13:52:44.670910 4970 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b7b7k\" (UniqueName: \"kubernetes.io/projected/8bcf5bd0-9824-4c40-b009-8f2e50ad08b0-kube-api-access-b7b7k\") on node \"crc\" DevicePath \"\"" Nov 28 13:52:44 crc kubenswrapper[4970]: I1128 13:52:44.670967 4970 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/8bcf5bd0-9824-4c40-b009-8f2e50ad08b0-apiservice-cert\") on node \"crc\" DevicePath \"\"" Nov 28 13:52:44 crc kubenswrapper[4970]: I1128 13:52:44.670982 4970 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/8bcf5bd0-9824-4c40-b009-8f2e50ad08b0-webhook-cert\") on node \"crc\" DevicePath \"\"" Nov 28 13:52:44 crc kubenswrapper[4970]: I1128 13:52:44.771652 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bvl6w\" (UniqueName: \"kubernetes.io/projected/a65f142b-ad4e-4901-9ca1-8d27e66fc59c-kube-api-access-bvl6w\") pod \"a65f142b-ad4e-4901-9ca1-8d27e66fc59c\" (UID: \"a65f142b-ad4e-4901-9ca1-8d27e66fc59c\") " Nov 28 13:52:44 crc kubenswrapper[4970]: I1128 13:52:44.774262 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a65f142b-ad4e-4901-9ca1-8d27e66fc59c-kube-api-access-bvl6w" (OuterVolumeSpecName: "kube-api-access-bvl6w") pod "a65f142b-ad4e-4901-9ca1-8d27e66fc59c" (UID: "a65f142b-ad4e-4901-9ca1-8d27e66fc59c"). InnerVolumeSpecName "kube-api-access-bvl6w". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 28 13:52:44 crc kubenswrapper[4970]: I1128 13:52:44.815824 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-7456869864-hwf9l" event={"ID":"8bcf5bd0-9824-4c40-b009-8f2e50ad08b0","Type":"ContainerDied","Data":"71ec5298b5cae377ff0092b1bd7f6809b43768e99329ffb1ec0266316a6c464d"} Nov 28 13:52:44 crc kubenswrapper[4970]: I1128 13:52:44.815887 4970 scope.go:117] "RemoveContainer" containerID="3ec6975cc83d8f66892a684775bf8fc026283124c64a2fef05829db449116928" Nov 28 13:52:44 crc kubenswrapper[4970]: I1128 13:52:44.816020 4970 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-7456869864-hwf9l" Nov 28 13:52:44 crc kubenswrapper[4970]: I1128 13:52:44.823337 4970 generic.go:334] "Generic (PLEG): container finished" podID="a65f142b-ad4e-4901-9ca1-8d27e66fc59c" containerID="3033b684932cb40bcd30965974670650b1d5091eea5dfee3c188fa9c5e33baf2" exitCode=0 Nov 28 13:52:44 crc kubenswrapper[4970]: I1128 13:52:44.823377 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-index-d6xcn" event={"ID":"a65f142b-ad4e-4901-9ca1-8d27e66fc59c","Type":"ContainerDied","Data":"3033b684932cb40bcd30965974670650b1d5091eea5dfee3c188fa9c5e33baf2"} Nov 28 13:52:44 crc kubenswrapper[4970]: I1128 13:52:44.823393 4970 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/keystone-operator-index-d6xcn" Nov 28 13:52:44 crc kubenswrapper[4970]: I1128 13:52:44.823402 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-index-d6xcn" event={"ID":"a65f142b-ad4e-4901-9ca1-8d27e66fc59c","Type":"ContainerDied","Data":"8475e5ee79ea457d3c68aa18543a573a86df620497bc9434436e2abdb5aa84d3"} Nov 28 13:52:44 crc kubenswrapper[4970]: I1128 13:52:44.839101 4970 scope.go:117] "RemoveContainer" containerID="3033b684932cb40bcd30965974670650b1d5091eea5dfee3c188fa9c5e33baf2" Nov 28 13:52:44 crc kubenswrapper[4970]: I1128 13:52:44.844477 4970 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-7456869864-hwf9l"] Nov 28 13:52:44 crc kubenswrapper[4970]: I1128 13:52:44.851147 4970 scope.go:117] "RemoveContainer" containerID="3033b684932cb40bcd30965974670650b1d5091eea5dfee3c188fa9c5e33baf2" Nov 28 13:52:44 crc kubenswrapper[4970]: I1128 13:52:44.851401 4970 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-7456869864-hwf9l"] Nov 28 13:52:44 crc kubenswrapper[4970]: E1128 13:52:44.851535 4970 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3033b684932cb40bcd30965974670650b1d5091eea5dfee3c188fa9c5e33baf2\": container with ID starting with 3033b684932cb40bcd30965974670650b1d5091eea5dfee3c188fa9c5e33baf2 not found: ID does not exist" containerID="3033b684932cb40bcd30965974670650b1d5091eea5dfee3c188fa9c5e33baf2" Nov 28 13:52:44 crc kubenswrapper[4970]: I1128 13:52:44.851565 4970 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3033b684932cb40bcd30965974670650b1d5091eea5dfee3c188fa9c5e33baf2"} err="failed to get container status \"3033b684932cb40bcd30965974670650b1d5091eea5dfee3c188fa9c5e33baf2\": rpc error: code = NotFound desc = could not find container \"3033b684932cb40bcd30965974670650b1d5091eea5dfee3c188fa9c5e33baf2\": container with ID starting with 3033b684932cb40bcd30965974670650b1d5091eea5dfee3c188fa9c5e33baf2 not found: ID does not exist" Nov 28 13:52:44 crc kubenswrapper[4970]: I1128 13:52:44.858582 4970 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/keystone-operator-index-d6xcn"] Nov 28 13:52:44 crc kubenswrapper[4970]: I1128 13:52:44.862815 4970 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/keystone-operator-index-d6xcn"] Nov 28 13:52:44 crc kubenswrapper[4970]: I1128 13:52:44.873822 4970 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bvl6w\" (UniqueName: \"kubernetes.io/projected/a65f142b-ad4e-4901-9ca1-8d27e66fc59c-kube-api-access-bvl6w\") on node \"crc\" DevicePath \"\"" Nov 28 13:52:45 crc kubenswrapper[4970]: I1128 13:52:45.387451 4970 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8bcf5bd0-9824-4c40-b009-8f2e50ad08b0" path="/var/lib/kubelet/pods/8bcf5bd0-9824-4c40-b009-8f2e50ad08b0/volumes" Nov 28 13:52:45 crc kubenswrapper[4970]: I1128 13:52:45.388127 4970 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a65f142b-ad4e-4901-9ca1-8d27e66fc59c" path="/var/lib/kubelet/pods/a65f142b-ad4e-4901-9ca1-8d27e66fc59c/volumes" Nov 28 13:52:45 crc kubenswrapper[4970]: I1128 13:52:45.389119 4970 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cbe3a2fc-2688-413f-b4e1-9ba678488f30" 
path="/var/lib/kubelet/pods/cbe3a2fc-2688-413f-b4e1-9ba678488f30/volumes" Nov 28 13:52:45 crc kubenswrapper[4970]: I1128 13:52:45.728806 4970 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/openstack-galera-1" Nov 28 13:52:45 crc kubenswrapper[4970]: I1128 13:52:45.781933 4970 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/openstack-galera-0" Nov 28 13:52:45 crc kubenswrapper[4970]: I1128 13:52:45.831462 4970 generic.go:334] "Generic (PLEG): container finished" podID="3a4491a2-79c8-4e5b-8f2f-6c8182f09885" containerID="22e6da56be8ccc2e55abefbafa87c681b32b59e74b0592edffe81f01886f9d79" exitCode=0 Nov 28 13:52:45 crc kubenswrapper[4970]: I1128 13:52:45.831595 4970 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/openstack-galera-1" Nov 28 13:52:45 crc kubenswrapper[4970]: I1128 13:52:45.831735 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/openstack-galera-1" event={"ID":"3a4491a2-79c8-4e5b-8f2f-6c8182f09885","Type":"ContainerDied","Data":"22e6da56be8ccc2e55abefbafa87c681b32b59e74b0592edffe81f01886f9d79"} Nov 28 13:52:45 crc kubenswrapper[4970]: I1128 13:52:45.831806 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/openstack-galera-1" event={"ID":"3a4491a2-79c8-4e5b-8f2f-6c8182f09885","Type":"ContainerDied","Data":"9378584313b1e286fd6e0e0bf88cc692e44a29ea5a897d1e83aeb1e01904834a"} Nov 28 13:52:45 crc kubenswrapper[4970]: I1128 13:52:45.831840 4970 scope.go:117] "RemoveContainer" containerID="22e6da56be8ccc2e55abefbafa87c681b32b59e74b0592edffe81f01886f9d79" Nov 28 13:52:45 crc kubenswrapper[4970]: I1128 13:52:45.835747 4970 generic.go:334] "Generic (PLEG): container finished" podID="1474c5bc-29c4-4da3-b2e9-900196941f19" containerID="0817c9c6a5a83728037407a7288119e8e5aa21266704002fa91de53268569359" exitCode=0 Nov 28 13:52:45 crc kubenswrapper[4970]: I1128 13:52:45.835800 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/openstack-galera-0" event={"ID":"1474c5bc-29c4-4da3-b2e9-900196941f19","Type":"ContainerDied","Data":"0817c9c6a5a83728037407a7288119e8e5aa21266704002fa91de53268569359"} Nov 28 13:52:45 crc kubenswrapper[4970]: I1128 13:52:45.835824 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/openstack-galera-0" event={"ID":"1474c5bc-29c4-4da3-b2e9-900196941f19","Type":"ContainerDied","Data":"809fbf33122f8f03413edde274e24bfe122a29774186624f0ca7d1c7782f2c76"} Nov 28 13:52:45 crc kubenswrapper[4970]: I1128 13:52:45.835875 4970 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="keystone-kuttl-tests/openstack-galera-0" Nov 28 13:52:45 crc kubenswrapper[4970]: I1128 13:52:45.843615 4970 generic.go:334] "Generic (PLEG): container finished" podID="dfa2f2ae-c626-4fd8-a04c-a762e271a467" containerID="d34959bfbdbb8c13b83ef8e984b0cf24e5f22ab99a9594c089aeea056517a236" exitCode=0 Nov 28 13:52:45 crc kubenswrapper[4970]: I1128 13:52:45.843665 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/rabbitmq-server-0" event={"ID":"dfa2f2ae-c626-4fd8-a04c-a762e271a467","Type":"ContainerDied","Data":"d34959bfbdbb8c13b83ef8e984b0cf24e5f22ab99a9594c089aeea056517a236"} Nov 28 13:52:45 crc kubenswrapper[4970]: I1128 13:52:45.858732 4970 scope.go:117] "RemoveContainer" containerID="d12f8faa293ffb2e3f55fb029a4b7998534bf5be5c47fd5bc159d5558ceb280d" Nov 28 13:52:45 crc kubenswrapper[4970]: I1128 13:52:45.895207 4970 scope.go:117] "RemoveContainer" containerID="22e6da56be8ccc2e55abefbafa87c681b32b59e74b0592edffe81f01886f9d79" Nov 28 13:52:45 crc kubenswrapper[4970]: E1128 13:52:45.899376 4970 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"22e6da56be8ccc2e55abefbafa87c681b32b59e74b0592edffe81f01886f9d79\": container with ID starting with 22e6da56be8ccc2e55abefbafa87c681b32b59e74b0592edffe81f01886f9d79 not found: ID does not exist" containerID="22e6da56be8ccc2e55abefbafa87c681b32b59e74b0592edffe81f01886f9d79" Nov 28 13:52:45 crc kubenswrapper[4970]: I1128 13:52:45.899423 4970 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"22e6da56be8ccc2e55abefbafa87c681b32b59e74b0592edffe81f01886f9d79"} err="failed to get container status \"22e6da56be8ccc2e55abefbafa87c681b32b59e74b0592edffe81f01886f9d79\": rpc error: code = NotFound desc = could not find container \"22e6da56be8ccc2e55abefbafa87c681b32b59e74b0592edffe81f01886f9d79\": container with ID starting with 22e6da56be8ccc2e55abefbafa87c681b32b59e74b0592edffe81f01886f9d79 not found: ID does not exist" Nov 28 13:52:45 crc kubenswrapper[4970]: I1128 13:52:45.899449 4970 scope.go:117] "RemoveContainer" containerID="d12f8faa293ffb2e3f55fb029a4b7998534bf5be5c47fd5bc159d5558ceb280d" Nov 28 13:52:45 crc kubenswrapper[4970]: I1128 13:52:45.900852 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/1474c5bc-29c4-4da3-b2e9-900196941f19-kolla-config\") pod \"1474c5bc-29c4-4da3-b2e9-900196941f19\" (UID: \"1474c5bc-29c4-4da3-b2e9-900196941f19\") " Nov 28 13:52:45 crc kubenswrapper[4970]: I1128 13:52:45.900944 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/3a4491a2-79c8-4e5b-8f2f-6c8182f09885-config-data-generated\") pod \"3a4491a2-79c8-4e5b-8f2f-6c8182f09885\" (UID: \"3a4491a2-79c8-4e5b-8f2f-6c8182f09885\") " Nov 28 13:52:45 crc kubenswrapper[4970]: I1128 13:52:45.900988 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/3a4491a2-79c8-4e5b-8f2f-6c8182f09885-config-data-default\") pod \"3a4491a2-79c8-4e5b-8f2f-6c8182f09885\" (UID: \"3a4491a2-79c8-4e5b-8f2f-6c8182f09885\") " Nov 28 13:52:45 crc kubenswrapper[4970]: I1128 13:52:45.901018 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mysql-db\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod 
\"3a4491a2-79c8-4e5b-8f2f-6c8182f09885\" (UID: \"3a4491a2-79c8-4e5b-8f2f-6c8182f09885\") " Nov 28 13:52:45 crc kubenswrapper[4970]: I1128 13:52:45.901063 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mysql-db\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"1474c5bc-29c4-4da3-b2e9-900196941f19\" (UID: \"1474c5bc-29c4-4da3-b2e9-900196941f19\") " Nov 28 13:52:45 crc kubenswrapper[4970]: I1128 13:52:45.901086 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3a4491a2-79c8-4e5b-8f2f-6c8182f09885-operator-scripts\") pod \"3a4491a2-79c8-4e5b-8f2f-6c8182f09885\" (UID: \"3a4491a2-79c8-4e5b-8f2f-6c8182f09885\") " Nov 28 13:52:45 crc kubenswrapper[4970]: I1128 13:52:45.901120 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ck7kf\" (UniqueName: \"kubernetes.io/projected/1474c5bc-29c4-4da3-b2e9-900196941f19-kube-api-access-ck7kf\") pod \"1474c5bc-29c4-4da3-b2e9-900196941f19\" (UID: \"1474c5bc-29c4-4da3-b2e9-900196941f19\") " Nov 28 13:52:45 crc kubenswrapper[4970]: I1128 13:52:45.901146 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/3a4491a2-79c8-4e5b-8f2f-6c8182f09885-kolla-config\") pod \"3a4491a2-79c8-4e5b-8f2f-6c8182f09885\" (UID: \"3a4491a2-79c8-4e5b-8f2f-6c8182f09885\") " Nov 28 13:52:45 crc kubenswrapper[4970]: I1128 13:52:45.901170 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n5cz5\" (UniqueName: \"kubernetes.io/projected/3a4491a2-79c8-4e5b-8f2f-6c8182f09885-kube-api-access-n5cz5\") pod \"3a4491a2-79c8-4e5b-8f2f-6c8182f09885\" (UID: \"3a4491a2-79c8-4e5b-8f2f-6c8182f09885\") " Nov 28 13:52:45 crc kubenswrapper[4970]: I1128 13:52:45.901229 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/1474c5bc-29c4-4da3-b2e9-900196941f19-config-data-default\") pod \"1474c5bc-29c4-4da3-b2e9-900196941f19\" (UID: \"1474c5bc-29c4-4da3-b2e9-900196941f19\") " Nov 28 13:52:45 crc kubenswrapper[4970]: I1128 13:52:45.901254 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1474c5bc-29c4-4da3-b2e9-900196941f19-operator-scripts\") pod \"1474c5bc-29c4-4da3-b2e9-900196941f19\" (UID: \"1474c5bc-29c4-4da3-b2e9-900196941f19\") " Nov 28 13:52:45 crc kubenswrapper[4970]: I1128 13:52:45.901276 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/1474c5bc-29c4-4da3-b2e9-900196941f19-config-data-generated\") pod \"1474c5bc-29c4-4da3-b2e9-900196941f19\" (UID: \"1474c5bc-29c4-4da3-b2e9-900196941f19\") " Nov 28 13:52:45 crc kubenswrapper[4970]: E1128 13:52:45.901493 4970 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d12f8faa293ffb2e3f55fb029a4b7998534bf5be5c47fd5bc159d5558ceb280d\": container with ID starting with d12f8faa293ffb2e3f55fb029a4b7998534bf5be5c47fd5bc159d5558ceb280d not found: ID does not exist" containerID="d12f8faa293ffb2e3f55fb029a4b7998534bf5be5c47fd5bc159d5558ceb280d" Nov 28 13:52:45 crc kubenswrapper[4970]: I1128 13:52:45.901586 4970 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"d12f8faa293ffb2e3f55fb029a4b7998534bf5be5c47fd5bc159d5558ceb280d"} err="failed to get container status \"d12f8faa293ffb2e3f55fb029a4b7998534bf5be5c47fd5bc159d5558ceb280d\": rpc error: code = NotFound desc = could not find container \"d12f8faa293ffb2e3f55fb029a4b7998534bf5be5c47fd5bc159d5558ceb280d\": container with ID starting with d12f8faa293ffb2e3f55fb029a4b7998534bf5be5c47fd5bc159d5558ceb280d not found: ID does not exist" Nov 28 13:52:45 crc kubenswrapper[4970]: I1128 13:52:45.901672 4970 scope.go:117] "RemoveContainer" containerID="0817c9c6a5a83728037407a7288119e8e5aa21266704002fa91de53268569359" Nov 28 13:52:45 crc kubenswrapper[4970]: I1128 13:52:45.901701 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3a4491a2-79c8-4e5b-8f2f-6c8182f09885-config-data-generated" (OuterVolumeSpecName: "config-data-generated") pod "3a4491a2-79c8-4e5b-8f2f-6c8182f09885" (UID: "3a4491a2-79c8-4e5b-8f2f-6c8182f09885"). InnerVolumeSpecName "config-data-generated". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 28 13:52:45 crc kubenswrapper[4970]: I1128 13:52:45.901809 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1474c5bc-29c4-4da3-b2e9-900196941f19-config-data-generated" (OuterVolumeSpecName: "config-data-generated") pod "1474c5bc-29c4-4da3-b2e9-900196941f19" (UID: "1474c5bc-29c4-4da3-b2e9-900196941f19"). InnerVolumeSpecName "config-data-generated". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 28 13:52:45 crc kubenswrapper[4970]: I1128 13:52:45.901834 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1474c5bc-29c4-4da3-b2e9-900196941f19-kolla-config" (OuterVolumeSpecName: "kolla-config") pod "1474c5bc-29c4-4da3-b2e9-900196941f19" (UID: "1474c5bc-29c4-4da3-b2e9-900196941f19"). InnerVolumeSpecName "kolla-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 28 13:52:45 crc kubenswrapper[4970]: I1128 13:52:45.902169 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3a4491a2-79c8-4e5b-8f2f-6c8182f09885-config-data-default" (OuterVolumeSpecName: "config-data-default") pod "3a4491a2-79c8-4e5b-8f2f-6c8182f09885" (UID: "3a4491a2-79c8-4e5b-8f2f-6c8182f09885"). InnerVolumeSpecName "config-data-default". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 28 13:52:45 crc kubenswrapper[4970]: I1128 13:52:45.902717 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3a4491a2-79c8-4e5b-8f2f-6c8182f09885-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "3a4491a2-79c8-4e5b-8f2f-6c8182f09885" (UID: "3a4491a2-79c8-4e5b-8f2f-6c8182f09885"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 28 13:52:45 crc kubenswrapper[4970]: I1128 13:52:45.903009 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3a4491a2-79c8-4e5b-8f2f-6c8182f09885-kolla-config" (OuterVolumeSpecName: "kolla-config") pod "3a4491a2-79c8-4e5b-8f2f-6c8182f09885" (UID: "3a4491a2-79c8-4e5b-8f2f-6c8182f09885"). InnerVolumeSpecName "kolla-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 28 13:52:45 crc kubenswrapper[4970]: I1128 13:52:45.903614 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1474c5bc-29c4-4da3-b2e9-900196941f19-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "1474c5bc-29c4-4da3-b2e9-900196941f19" (UID: "1474c5bc-29c4-4da3-b2e9-900196941f19"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 28 13:52:45 crc kubenswrapper[4970]: I1128 13:52:45.904876 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1474c5bc-29c4-4da3-b2e9-900196941f19-config-data-default" (OuterVolumeSpecName: "config-data-default") pod "1474c5bc-29c4-4da3-b2e9-900196941f19" (UID: "1474c5bc-29c4-4da3-b2e9-900196941f19"). InnerVolumeSpecName "config-data-default". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 28 13:52:45 crc kubenswrapper[4970]: I1128 13:52:45.907671 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3a4491a2-79c8-4e5b-8f2f-6c8182f09885-kube-api-access-n5cz5" (OuterVolumeSpecName: "kube-api-access-n5cz5") pod "3a4491a2-79c8-4e5b-8f2f-6c8182f09885" (UID: "3a4491a2-79c8-4e5b-8f2f-6c8182f09885"). InnerVolumeSpecName "kube-api-access-n5cz5". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 28 13:52:45 crc kubenswrapper[4970]: I1128 13:52:45.907844 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1474c5bc-29c4-4da3-b2e9-900196941f19-kube-api-access-ck7kf" (OuterVolumeSpecName: "kube-api-access-ck7kf") pod "1474c5bc-29c4-4da3-b2e9-900196941f19" (UID: "1474c5bc-29c4-4da3-b2e9-900196941f19"). InnerVolumeSpecName "kube-api-access-ck7kf". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 28 13:52:45 crc kubenswrapper[4970]: I1128 13:52:45.912266 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage07-crc" (OuterVolumeSpecName: "mysql-db") pod "3a4491a2-79c8-4e5b-8f2f-6c8182f09885" (UID: "3a4491a2-79c8-4e5b-8f2f-6c8182f09885"). InnerVolumeSpecName "local-storage07-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Nov 28 13:52:45 crc kubenswrapper[4970]: I1128 13:52:45.912442 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage01-crc" (OuterVolumeSpecName: "mysql-db") pod "1474c5bc-29c4-4da3-b2e9-900196941f19" (UID: "1474c5bc-29c4-4da3-b2e9-900196941f19"). InnerVolumeSpecName "local-storage01-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Nov 28 13:52:45 crc kubenswrapper[4970]: I1128 13:52:45.937612 4970 scope.go:117] "RemoveContainer" containerID="7781a0ed5aead47f5ad959cf42fe8393ca0218eaabfa3e04b3d3565eac7ce46b" Nov 28 13:52:45 crc kubenswrapper[4970]: I1128 13:52:45.958921 4970 scope.go:117] "RemoveContainer" containerID="0817c9c6a5a83728037407a7288119e8e5aa21266704002fa91de53268569359" Nov 28 13:52:45 crc kubenswrapper[4970]: E1128 13:52:45.959344 4970 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0817c9c6a5a83728037407a7288119e8e5aa21266704002fa91de53268569359\": container with ID starting with 0817c9c6a5a83728037407a7288119e8e5aa21266704002fa91de53268569359 not found: ID does not exist" containerID="0817c9c6a5a83728037407a7288119e8e5aa21266704002fa91de53268569359" Nov 28 13:52:45 crc kubenswrapper[4970]: I1128 13:52:45.959394 4970 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0817c9c6a5a83728037407a7288119e8e5aa21266704002fa91de53268569359"} err="failed to get container status \"0817c9c6a5a83728037407a7288119e8e5aa21266704002fa91de53268569359\": rpc error: code = NotFound desc = could not find container \"0817c9c6a5a83728037407a7288119e8e5aa21266704002fa91de53268569359\": container with ID starting with 0817c9c6a5a83728037407a7288119e8e5aa21266704002fa91de53268569359 not found: ID does not exist" Nov 28 13:52:45 crc kubenswrapper[4970]: I1128 13:52:45.959423 4970 scope.go:117] "RemoveContainer" containerID="7781a0ed5aead47f5ad959cf42fe8393ca0218eaabfa3e04b3d3565eac7ce46b" Nov 28 13:52:45 crc kubenswrapper[4970]: E1128 13:52:45.959822 4970 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7781a0ed5aead47f5ad959cf42fe8393ca0218eaabfa3e04b3d3565eac7ce46b\": container with ID starting with 7781a0ed5aead47f5ad959cf42fe8393ca0218eaabfa3e04b3d3565eac7ce46b not found: ID does not exist" containerID="7781a0ed5aead47f5ad959cf42fe8393ca0218eaabfa3e04b3d3565eac7ce46b" Nov 28 13:52:45 crc kubenswrapper[4970]: I1128 13:52:45.959853 4970 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7781a0ed5aead47f5ad959cf42fe8393ca0218eaabfa3e04b3d3565eac7ce46b"} err="failed to get container status \"7781a0ed5aead47f5ad959cf42fe8393ca0218eaabfa3e04b3d3565eac7ce46b\": rpc error: code = NotFound desc = could not find container \"7781a0ed5aead47f5ad959cf42fe8393ca0218eaabfa3e04b3d3565eac7ce46b\": container with ID starting with 7781a0ed5aead47f5ad959cf42fe8393ca0218eaabfa3e04b3d3565eac7ce46b not found: ID does not exist" Nov 28 13:52:46 crc kubenswrapper[4970]: I1128 13:52:46.002692 4970 reconciler_common.go:293] "Volume detached for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/1474c5bc-29c4-4da3-b2e9-900196941f19-config-data-default\") on node \"crc\" DevicePath \"\"" Nov 28 13:52:46 crc kubenswrapper[4970]: I1128 13:52:46.002739 4970 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1474c5bc-29c4-4da3-b2e9-900196941f19-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 28 13:52:46 crc kubenswrapper[4970]: I1128 13:52:46.002751 4970 reconciler_common.go:293] "Volume detached for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/1474c5bc-29c4-4da3-b2e9-900196941f19-config-data-generated\") on node \"crc\" DevicePath \"\"" Nov 28 
13:52:46 crc kubenswrapper[4970]: I1128 13:52:46.002765 4970 reconciler_common.go:293] "Volume detached for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/1474c5bc-29c4-4da3-b2e9-900196941f19-kolla-config\") on node \"crc\" DevicePath \"\"" Nov 28 13:52:46 crc kubenswrapper[4970]: I1128 13:52:46.002777 4970 reconciler_common.go:293] "Volume detached for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/3a4491a2-79c8-4e5b-8f2f-6c8182f09885-config-data-generated\") on node \"crc\" DevicePath \"\"" Nov 28 13:52:46 crc kubenswrapper[4970]: I1128 13:52:46.002788 4970 reconciler_common.go:293] "Volume detached for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/3a4491a2-79c8-4e5b-8f2f-6c8182f09885-config-data-default\") on node \"crc\" DevicePath \"\"" Nov 28 13:52:46 crc kubenswrapper[4970]: I1128 13:52:46.002840 4970 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" " Nov 28 13:52:46 crc kubenswrapper[4970]: I1128 13:52:46.002858 4970 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" " Nov 28 13:52:46 crc kubenswrapper[4970]: I1128 13:52:46.002869 4970 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3a4491a2-79c8-4e5b-8f2f-6c8182f09885-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 28 13:52:46 crc kubenswrapper[4970]: I1128 13:52:46.002881 4970 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ck7kf\" (UniqueName: \"kubernetes.io/projected/1474c5bc-29c4-4da3-b2e9-900196941f19-kube-api-access-ck7kf\") on node \"crc\" DevicePath \"\"" Nov 28 13:52:46 crc kubenswrapper[4970]: I1128 13:52:46.002892 4970 reconciler_common.go:293] "Volume detached for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/3a4491a2-79c8-4e5b-8f2f-6c8182f09885-kolla-config\") on node \"crc\" DevicePath \"\"" Nov 28 13:52:46 crc kubenswrapper[4970]: I1128 13:52:46.002904 4970 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n5cz5\" (UniqueName: \"kubernetes.io/projected/3a4491a2-79c8-4e5b-8f2f-6c8182f09885-kube-api-access-n5cz5\") on node \"crc\" DevicePath \"\"" Nov 28 13:52:46 crc kubenswrapper[4970]: I1128 13:52:46.014128 4970 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage07-crc" (UniqueName: "kubernetes.io/local-volume/local-storage07-crc") on node "crc" Nov 28 13:52:46 crc kubenswrapper[4970]: I1128 13:52:46.015890 4970 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage01-crc" (UniqueName: "kubernetes.io/local-volume/local-storage01-crc") on node "crc" Nov 28 13:52:46 crc kubenswrapper[4970]: I1128 13:52:46.103669 4970 reconciler_common.go:293] "Volume detached for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" DevicePath \"\"" Nov 28 13:52:46 crc kubenswrapper[4970]: I1128 13:52:46.103700 4970 reconciler_common.go:293] "Volume detached for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" DevicePath \"\"" Nov 28 13:52:46 crc kubenswrapper[4970]: I1128 13:52:46.161012 4970 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["keystone-kuttl-tests/openstack-galera-1"] Nov 28 13:52:46 
crc kubenswrapper[4970]: I1128 13:52:46.165414 4970 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["keystone-kuttl-tests/openstack-galera-1"] Nov 28 13:52:46 crc kubenswrapper[4970]: I1128 13:52:46.175417 4970 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["keystone-kuttl-tests/openstack-galera-0"] Nov 28 13:52:46 crc kubenswrapper[4970]: I1128 13:52:46.182163 4970 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["keystone-kuttl-tests/openstack-galera-0"] Nov 28 13:52:46 crc kubenswrapper[4970]: I1128 13:52:46.319427 4970 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/infra-operator-controller-manager-7c855bfbc4-jhn25"] Nov 28 13:52:46 crc kubenswrapper[4970]: I1128 13:52:46.319744 4970 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/infra-operator-controller-manager-7c855bfbc4-jhn25" podUID="d8b95186-66e8-493a-8eb9-79e4cd5b5a7d" containerName="manager" containerID="cri-o://26166f82d6294755ed8cbbbf2de0a966eac1347f297da23ff2d500d91152689a" gracePeriod=10 Nov 28 13:52:46 crc kubenswrapper[4970]: I1128 13:52:46.320070 4970 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/infra-operator-controller-manager-7c855bfbc4-jhn25" podUID="d8b95186-66e8-493a-8eb9-79e4cd5b5a7d" containerName="kube-rbac-proxy" containerID="cri-o://fcdc7b828b38f9346c1685bad3df9047ffb317c10c55ae7e67b569220f83b16e" gracePeriod=10 Nov 28 13:52:46 crc kubenswrapper[4970]: I1128 13:52:46.642469 4970 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/infra-operator-index-jfj76"] Nov 28 13:52:46 crc kubenswrapper[4970]: I1128 13:52:46.642985 4970 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/infra-operator-index-jfj76" podUID="defb0064-6731-4f52-872a-a26d8e82dd41" containerName="registry-server" containerID="cri-o://5bb08c62fb4b63a822e60972fd50f5b3cf4793a9a31f02e4308380658fcf1a3f" gracePeriod=30 Nov 28 13:52:46 crc kubenswrapper[4970]: I1128 13:52:46.676656 4970 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/5d473c3169f40b179d14921c90af2c8546b7b757fe551b7dba7d903f5dlwds2"] Nov 28 13:52:46 crc kubenswrapper[4970]: I1128 13:52:46.682142 4970 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/5d473c3169f40b179d14921c90af2c8546b7b757fe551b7dba7d903f5dlwds2"] Nov 28 13:52:46 crc kubenswrapper[4970]: I1128 13:52:46.802622 4970 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/rabbitmq-server-0" Nov 28 13:52:46 crc kubenswrapper[4970]: I1128 13:52:46.850948 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/rabbitmq-server-0" event={"ID":"dfa2f2ae-c626-4fd8-a04c-a762e271a467","Type":"ContainerDied","Data":"0ed3cb1adb1799e5db85e4864dc08df31995ccf37f8a80a413836ad153021cc4"} Nov 28 13:52:46 crc kubenswrapper[4970]: I1128 13:52:46.850992 4970 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="keystone-kuttl-tests/rabbitmq-server-0" Nov 28 13:52:46 crc kubenswrapper[4970]: I1128 13:52:46.851008 4970 scope.go:117] "RemoveContainer" containerID="d34959bfbdbb8c13b83ef8e984b0cf24e5f22ab99a9594c089aeea056517a236" Nov 28 13:52:46 crc kubenswrapper[4970]: I1128 13:52:46.867459 4970 scope.go:117] "RemoveContainer" containerID="49d7a49f1b55c97d9b5689058fdb75d79f956f69e116755a84ab943c4e4da4ed" Nov 28 13:52:46 crc kubenswrapper[4970]: I1128 13:52:46.913896 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/dfa2f2ae-c626-4fd8-a04c-a762e271a467-erlang-cookie-secret\") pod \"dfa2f2ae-c626-4fd8-a04c-a762e271a467\" (UID: \"dfa2f2ae-c626-4fd8-a04c-a762e271a467\") " Nov 28 13:52:46 crc kubenswrapper[4970]: I1128 13:52:46.913939 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/dfa2f2ae-c626-4fd8-a04c-a762e271a467-rabbitmq-plugins\") pod \"dfa2f2ae-c626-4fd8-a04c-a762e271a467\" (UID: \"dfa2f2ae-c626-4fd8-a04c-a762e271a467\") " Nov 28 13:52:46 crc kubenswrapper[4970]: I1128 13:52:46.913965 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/dfa2f2ae-c626-4fd8-a04c-a762e271a467-pod-info\") pod \"dfa2f2ae-c626-4fd8-a04c-a762e271a467\" (UID: \"dfa2f2ae-c626-4fd8-a04c-a762e271a467\") " Nov 28 13:52:46 crc kubenswrapper[4970]: I1128 13:52:46.914018 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2p4d4\" (UniqueName: \"kubernetes.io/projected/dfa2f2ae-c626-4fd8-a04c-a762e271a467-kube-api-access-2p4d4\") pod \"dfa2f2ae-c626-4fd8-a04c-a762e271a467\" (UID: \"dfa2f2ae-c626-4fd8-a04c-a762e271a467\") " Nov 28 13:52:46 crc kubenswrapper[4970]: I1128 13:52:46.914103 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/dfa2f2ae-c626-4fd8-a04c-a762e271a467-rabbitmq-confd\") pod \"dfa2f2ae-c626-4fd8-a04c-a762e271a467\" (UID: \"dfa2f2ae-c626-4fd8-a04c-a762e271a467\") " Nov 28 13:52:46 crc kubenswrapper[4970]: I1128 13:52:46.914132 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/dfa2f2ae-c626-4fd8-a04c-a762e271a467-rabbitmq-erlang-cookie\") pod \"dfa2f2ae-c626-4fd8-a04c-a762e271a467\" (UID: \"dfa2f2ae-c626-4fd8-a04c-a762e271a467\") " Nov 28 13:52:46 crc kubenswrapper[4970]: I1128 13:52:46.914385 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-170f589f-095f-44f6-a9f9-fad686c8f582\") pod \"dfa2f2ae-c626-4fd8-a04c-a762e271a467\" (UID: \"dfa2f2ae-c626-4fd8-a04c-a762e271a467\") " Nov 28 13:52:46 crc kubenswrapper[4970]: I1128 13:52:46.914426 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/dfa2f2ae-c626-4fd8-a04c-a762e271a467-plugins-conf\") pod \"dfa2f2ae-c626-4fd8-a04c-a762e271a467\" (UID: \"dfa2f2ae-c626-4fd8-a04c-a762e271a467\") " Nov 28 13:52:46 crc kubenswrapper[4970]: I1128 13:52:46.914867 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dfa2f2ae-c626-4fd8-a04c-a762e271a467-plugins-conf" (OuterVolumeSpecName: 
"plugins-conf") pod "dfa2f2ae-c626-4fd8-a04c-a762e271a467" (UID: "dfa2f2ae-c626-4fd8-a04c-a762e271a467"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 28 13:52:46 crc kubenswrapper[4970]: I1128 13:52:46.915101 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dfa2f2ae-c626-4fd8-a04c-a762e271a467-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "dfa2f2ae-c626-4fd8-a04c-a762e271a467" (UID: "dfa2f2ae-c626-4fd8-a04c-a762e271a467"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 28 13:52:46 crc kubenswrapper[4970]: I1128 13:52:46.915108 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dfa2f2ae-c626-4fd8-a04c-a762e271a467-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "dfa2f2ae-c626-4fd8-a04c-a762e271a467" (UID: "dfa2f2ae-c626-4fd8-a04c-a762e271a467"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 28 13:52:46 crc kubenswrapper[4970]: I1128 13:52:46.917885 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dfa2f2ae-c626-4fd8-a04c-a762e271a467-kube-api-access-2p4d4" (OuterVolumeSpecName: "kube-api-access-2p4d4") pod "dfa2f2ae-c626-4fd8-a04c-a762e271a467" (UID: "dfa2f2ae-c626-4fd8-a04c-a762e271a467"). InnerVolumeSpecName "kube-api-access-2p4d4". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 28 13:52:46 crc kubenswrapper[4970]: I1128 13:52:46.918074 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/dfa2f2ae-c626-4fd8-a04c-a762e271a467-pod-info" (OuterVolumeSpecName: "pod-info") pod "dfa2f2ae-c626-4fd8-a04c-a762e271a467" (UID: "dfa2f2ae-c626-4fd8-a04c-a762e271a467"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Nov 28 13:52:46 crc kubenswrapper[4970]: I1128 13:52:46.922559 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dfa2f2ae-c626-4fd8-a04c-a762e271a467-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "dfa2f2ae-c626-4fd8-a04c-a762e271a467" (UID: "dfa2f2ae-c626-4fd8-a04c-a762e271a467"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 28 13:52:46 crc kubenswrapper[4970]: I1128 13:52:46.926561 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-170f589f-095f-44f6-a9f9-fad686c8f582" (OuterVolumeSpecName: "persistence") pod "dfa2f2ae-c626-4fd8-a04c-a762e271a467" (UID: "dfa2f2ae-c626-4fd8-a04c-a762e271a467"). InnerVolumeSpecName "pvc-170f589f-095f-44f6-a9f9-fad686c8f582". PluginName "kubernetes.io/csi", VolumeGidValue "" Nov 28 13:52:46 crc kubenswrapper[4970]: I1128 13:52:46.989384 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dfa2f2ae-c626-4fd8-a04c-a762e271a467-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "dfa2f2ae-c626-4fd8-a04c-a762e271a467" (UID: "dfa2f2ae-c626-4fd8-a04c-a762e271a467"). InnerVolumeSpecName "rabbitmq-confd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 28 13:52:47 crc kubenswrapper[4970]: I1128 13:52:47.017836 4970 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2p4d4\" (UniqueName: \"kubernetes.io/projected/dfa2f2ae-c626-4fd8-a04c-a762e271a467-kube-api-access-2p4d4\") on node \"crc\" DevicePath \"\"" Nov 28 13:52:47 crc kubenswrapper[4970]: I1128 13:52:47.017882 4970 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/dfa2f2ae-c626-4fd8-a04c-a762e271a467-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Nov 28 13:52:47 crc kubenswrapper[4970]: I1128 13:52:47.017898 4970 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/dfa2f2ae-c626-4fd8-a04c-a762e271a467-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Nov 28 13:52:47 crc kubenswrapper[4970]: I1128 13:52:47.017942 4970 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-170f589f-095f-44f6-a9f9-fad686c8f582\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-170f589f-095f-44f6-a9f9-fad686c8f582\") on node \"crc\" " Nov 28 13:52:47 crc kubenswrapper[4970]: I1128 13:52:47.017958 4970 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/dfa2f2ae-c626-4fd8-a04c-a762e271a467-plugins-conf\") on node \"crc\" DevicePath \"\"" Nov 28 13:52:47 crc kubenswrapper[4970]: I1128 13:52:47.017972 4970 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/dfa2f2ae-c626-4fd8-a04c-a762e271a467-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Nov 28 13:52:47 crc kubenswrapper[4970]: I1128 13:52:47.017988 4970 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/dfa2f2ae-c626-4fd8-a04c-a762e271a467-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Nov 28 13:52:47 crc kubenswrapper[4970]: I1128 13:52:47.018000 4970 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/dfa2f2ae-c626-4fd8-a04c-a762e271a467-pod-info\") on node \"crc\" DevicePath \"\"" Nov 28 13:52:47 crc kubenswrapper[4970]: I1128 13:52:47.035496 4970 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... 
Nov 28 13:52:47 crc kubenswrapper[4970]: I1128 13:52:47.035695 4970 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-170f589f-095f-44f6-a9f9-fad686c8f582" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-170f589f-095f-44f6-a9f9-fad686c8f582") on node "crc" Nov 28 13:52:47 crc kubenswrapper[4970]: I1128 13:52:47.118865 4970 reconciler_common.go:293] "Volume detached for volume \"pvc-170f589f-095f-44f6-a9f9-fad686c8f582\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-170f589f-095f-44f6-a9f9-fad686c8f582\") on node \"crc\" DevicePath \"\"" Nov 28 13:52:47 crc kubenswrapper[4970]: I1128 13:52:47.184300 4970 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["keystone-kuttl-tests/rabbitmq-server-0"] Nov 28 13:52:47 crc kubenswrapper[4970]: I1128 13:52:47.190354 4970 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["keystone-kuttl-tests/rabbitmq-server-0"] Nov 28 13:52:47 crc kubenswrapper[4970]: I1128 13:52:47.411436 4970 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1474c5bc-29c4-4da3-b2e9-900196941f19" path="/var/lib/kubelet/pods/1474c5bc-29c4-4da3-b2e9-900196941f19/volumes" Nov 28 13:52:47 crc kubenswrapper[4970]: I1128 13:52:47.412044 4970 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3a4491a2-79c8-4e5b-8f2f-6c8182f09885" path="/var/lib/kubelet/pods/3a4491a2-79c8-4e5b-8f2f-6c8182f09885/volumes" Nov 28 13:52:47 crc kubenswrapper[4970]: I1128 13:52:47.413832 4970 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dfa2f2ae-c626-4fd8-a04c-a762e271a467" path="/var/lib/kubelet/pods/dfa2f2ae-c626-4fd8-a04c-a762e271a467/volumes" Nov 28 13:52:47 crc kubenswrapper[4970]: I1128 13:52:47.414477 4970 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e958b4be-5525-4751-a2e5-feecdea9d82c" path="/var/lib/kubelet/pods/e958b4be-5525-4751-a2e5-feecdea9d82c/volumes" Nov 28 13:52:47 crc kubenswrapper[4970]: I1128 13:52:47.651313 4970 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-index-jfj76" Nov 28 13:52:47 crc kubenswrapper[4970]: I1128 13:52:47.826488 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nv47p\" (UniqueName: \"kubernetes.io/projected/defb0064-6731-4f52-872a-a26d8e82dd41-kube-api-access-nv47p\") pod \"defb0064-6731-4f52-872a-a26d8e82dd41\" (UID: \"defb0064-6731-4f52-872a-a26d8e82dd41\") " Nov 28 13:52:47 crc kubenswrapper[4970]: I1128 13:52:47.830481 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/defb0064-6731-4f52-872a-a26d8e82dd41-kube-api-access-nv47p" (OuterVolumeSpecName: "kube-api-access-nv47p") pod "defb0064-6731-4f52-872a-a26d8e82dd41" (UID: "defb0064-6731-4f52-872a-a26d8e82dd41"). InnerVolumeSpecName "kube-api-access-nv47p". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 28 13:52:47 crc kubenswrapper[4970]: I1128 13:52:47.861825 4970 generic.go:334] "Generic (PLEG): container finished" podID="defb0064-6731-4f52-872a-a26d8e82dd41" containerID="5bb08c62fb4b63a822e60972fd50f5b3cf4793a9a31f02e4308380658fcf1a3f" exitCode=0 Nov 28 13:52:47 crc kubenswrapper[4970]: I1128 13:52:47.861911 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-index-jfj76" event={"ID":"defb0064-6731-4f52-872a-a26d8e82dd41","Type":"ContainerDied","Data":"5bb08c62fb4b63a822e60972fd50f5b3cf4793a9a31f02e4308380658fcf1a3f"} Nov 28 13:52:47 crc kubenswrapper[4970]: I1128 13:52:47.861981 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-index-jfj76" event={"ID":"defb0064-6731-4f52-872a-a26d8e82dd41","Type":"ContainerDied","Data":"dab6ba14e2048c2b6cdf88d408edee3b4a19d3ce8e33e151f4823f686a70bd96"} Nov 28 13:52:47 crc kubenswrapper[4970]: I1128 13:52:47.861932 4970 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-index-jfj76" Nov 28 13:52:47 crc kubenswrapper[4970]: I1128 13:52:47.862014 4970 scope.go:117] "RemoveContainer" containerID="5bb08c62fb4b63a822e60972fd50f5b3cf4793a9a31f02e4308380658fcf1a3f" Nov 28 13:52:47 crc kubenswrapper[4970]: I1128 13:52:47.865330 4970 generic.go:334] "Generic (PLEG): container finished" podID="d8b95186-66e8-493a-8eb9-79e4cd5b5a7d" containerID="fcdc7b828b38f9346c1685bad3df9047ffb317c10c55ae7e67b569220f83b16e" exitCode=0 Nov 28 13:52:47 crc kubenswrapper[4970]: I1128 13:52:47.865358 4970 generic.go:334] "Generic (PLEG): container finished" podID="d8b95186-66e8-493a-8eb9-79e4cd5b5a7d" containerID="26166f82d6294755ed8cbbbf2de0a966eac1347f297da23ff2d500d91152689a" exitCode=0 Nov 28 13:52:47 crc kubenswrapper[4970]: I1128 13:52:47.865379 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-7c855bfbc4-jhn25" event={"ID":"d8b95186-66e8-493a-8eb9-79e4cd5b5a7d","Type":"ContainerDied","Data":"fcdc7b828b38f9346c1685bad3df9047ffb317c10c55ae7e67b569220f83b16e"} Nov 28 13:52:47 crc kubenswrapper[4970]: I1128 13:52:47.865398 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-7c855bfbc4-jhn25" event={"ID":"d8b95186-66e8-493a-8eb9-79e4cd5b5a7d","Type":"ContainerDied","Data":"26166f82d6294755ed8cbbbf2de0a966eac1347f297da23ff2d500d91152689a"} Nov 28 13:52:47 crc kubenswrapper[4970]: I1128 13:52:47.879341 4970 scope.go:117] "RemoveContainer" containerID="5bb08c62fb4b63a822e60972fd50f5b3cf4793a9a31f02e4308380658fcf1a3f" Nov 28 13:52:47 crc kubenswrapper[4970]: E1128 13:52:47.879649 4970 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5bb08c62fb4b63a822e60972fd50f5b3cf4793a9a31f02e4308380658fcf1a3f\": container with ID starting with 5bb08c62fb4b63a822e60972fd50f5b3cf4793a9a31f02e4308380658fcf1a3f not found: ID does not exist" containerID="5bb08c62fb4b63a822e60972fd50f5b3cf4793a9a31f02e4308380658fcf1a3f" Nov 28 13:52:47 crc kubenswrapper[4970]: I1128 13:52:47.879701 4970 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5bb08c62fb4b63a822e60972fd50f5b3cf4793a9a31f02e4308380658fcf1a3f"} err="failed to get container status \"5bb08c62fb4b63a822e60972fd50f5b3cf4793a9a31f02e4308380658fcf1a3f\": rpc error: code = NotFound desc = could not 
find container \"5bb08c62fb4b63a822e60972fd50f5b3cf4793a9a31f02e4308380658fcf1a3f\": container with ID starting with 5bb08c62fb4b63a822e60972fd50f5b3cf4793a9a31f02e4308380658fcf1a3f not found: ID does not exist" Nov 28 13:52:47 crc kubenswrapper[4970]: I1128 13:52:47.886476 4970 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/infra-operator-index-jfj76"] Nov 28 13:52:47 crc kubenswrapper[4970]: I1128 13:52:47.893897 4970 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/infra-operator-index-jfj76"] Nov 28 13:52:47 crc kubenswrapper[4970]: I1128 13:52:47.928298 4970 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nv47p\" (UniqueName: \"kubernetes.io/projected/defb0064-6731-4f52-872a-a26d8e82dd41-kube-api-access-nv47p\") on node \"crc\" DevicePath \"\"" Nov 28 13:52:48 crc kubenswrapper[4970]: I1128 13:52:48.107403 4970 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-7bdb74f78d-ct66p"] Nov 28 13:52:48 crc kubenswrapper[4970]: I1128 13:52:48.107595 4970 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/mariadb-operator-controller-manager-7bdb74f78d-ct66p" podUID="4729d0a5-be5a-4c85-83eb-f213d31f3755" containerName="manager" containerID="cri-o://14802a15c862493a6fb83e83a11d6ce21c0646bffbd8ac711da3252e2b2ace6e" gracePeriod=10 Nov 28 13:52:48 crc kubenswrapper[4970]: I1128 13:52:48.329483 4970 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-7c855bfbc4-jhn25" Nov 28 13:52:48 crc kubenswrapper[4970]: I1128 13:52:48.417727 4970 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/mariadb-operator-index-w2zqv"] Nov 28 13:52:48 crc kubenswrapper[4970]: I1128 13:52:48.418033 4970 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/mariadb-operator-index-w2zqv" podUID="9d1382f3-12a1-4d6a-84ce-8ca5fb915aa3" containerName="registry-server" containerID="cri-o://6d6e8d2c141c112a6b3b522c25558a73900d98b7f358a55f70b227ac09cc4b29" gracePeriod=30 Nov 28 13:52:48 crc kubenswrapper[4970]: I1128 13:52:48.433272 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/d8b95186-66e8-493a-8eb9-79e4cd5b5a7d-apiservice-cert\") pod \"d8b95186-66e8-493a-8eb9-79e4cd5b5a7d\" (UID: \"d8b95186-66e8-493a-8eb9-79e4cd5b5a7d\") " Nov 28 13:52:48 crc kubenswrapper[4970]: I1128 13:52:48.433590 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/d8b95186-66e8-493a-8eb9-79e4cd5b5a7d-webhook-cert\") pod \"d8b95186-66e8-493a-8eb9-79e4cd5b5a7d\" (UID: \"d8b95186-66e8-493a-8eb9-79e4cd5b5a7d\") " Nov 28 13:52:48 crc kubenswrapper[4970]: I1128 13:52:48.434014 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zq2cm\" (UniqueName: \"kubernetes.io/projected/d8b95186-66e8-493a-8eb9-79e4cd5b5a7d-kube-api-access-zq2cm\") pod \"d8b95186-66e8-493a-8eb9-79e4cd5b5a7d\" (UID: \"d8b95186-66e8-493a-8eb9-79e4cd5b5a7d\") " Nov 28 13:52:48 crc kubenswrapper[4970]: I1128 13:52:48.436406 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d8b95186-66e8-493a-8eb9-79e4cd5b5a7d-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod 
"d8b95186-66e8-493a-8eb9-79e4cd5b5a7d" (UID: "d8b95186-66e8-493a-8eb9-79e4cd5b5a7d"). InnerVolumeSpecName "apiservice-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 28 13:52:48 crc kubenswrapper[4970]: I1128 13:52:48.437651 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d8b95186-66e8-493a-8eb9-79e4cd5b5a7d-kube-api-access-zq2cm" (OuterVolumeSpecName: "kube-api-access-zq2cm") pod "d8b95186-66e8-493a-8eb9-79e4cd5b5a7d" (UID: "d8b95186-66e8-493a-8eb9-79e4cd5b5a7d"). InnerVolumeSpecName "kube-api-access-zq2cm". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 28 13:52:48 crc kubenswrapper[4970]: I1128 13:52:48.446907 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d8b95186-66e8-493a-8eb9-79e4cd5b5a7d-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "d8b95186-66e8-493a-8eb9-79e4cd5b5a7d" (UID: "d8b95186-66e8-493a-8eb9-79e4cd5b5a7d"). InnerVolumeSpecName "webhook-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 28 13:52:48 crc kubenswrapper[4970]: I1128 13:52:48.455656 4970 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/27e8bc079695f3aed52a6c5be68196d91a6230a1a03a8fc87a19aa534f92sbs"] Nov 28 13:52:48 crc kubenswrapper[4970]: I1128 13:52:48.459443 4970 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/27e8bc079695f3aed52a6c5be68196d91a6230a1a03a8fc87a19aa534f92sbs"] Nov 28 13:52:48 crc kubenswrapper[4970]: I1128 13:52:48.535807 4970 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/d8b95186-66e8-493a-8eb9-79e4cd5b5a7d-apiservice-cert\") on node \"crc\" DevicePath \"\"" Nov 28 13:52:48 crc kubenswrapper[4970]: I1128 13:52:48.535852 4970 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/d8b95186-66e8-493a-8eb9-79e4cd5b5a7d-webhook-cert\") on node \"crc\" DevicePath \"\"" Nov 28 13:52:48 crc kubenswrapper[4970]: I1128 13:52:48.535864 4970 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zq2cm\" (UniqueName: \"kubernetes.io/projected/d8b95186-66e8-493a-8eb9-79e4cd5b5a7d-kube-api-access-zq2cm\") on node \"crc\" DevicePath \"\"" Nov 28 13:52:48 crc kubenswrapper[4970]: I1128 13:52:48.874246 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-7c855bfbc4-jhn25" event={"ID":"d8b95186-66e8-493a-8eb9-79e4cd5b5a7d","Type":"ContainerDied","Data":"e46e313c9050188c19cd9c042fa3abad000bb012f358d6d145ce1e1e3f07f09c"} Nov 28 13:52:48 crc kubenswrapper[4970]: I1128 13:52:48.874271 4970 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-7c855bfbc4-jhn25" Nov 28 13:52:48 crc kubenswrapper[4970]: I1128 13:52:48.874602 4970 scope.go:117] "RemoveContainer" containerID="fcdc7b828b38f9346c1685bad3df9047ffb317c10c55ae7e67b569220f83b16e" Nov 28 13:52:48 crc kubenswrapper[4970]: I1128 13:52:48.876959 4970 generic.go:334] "Generic (PLEG): container finished" podID="4729d0a5-be5a-4c85-83eb-f213d31f3755" containerID="14802a15c862493a6fb83e83a11d6ce21c0646bffbd8ac711da3252e2b2ace6e" exitCode=0 Nov 28 13:52:48 crc kubenswrapper[4970]: I1128 13:52:48.877042 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-7bdb74f78d-ct66p" event={"ID":"4729d0a5-be5a-4c85-83eb-f213d31f3755","Type":"ContainerDied","Data":"14802a15c862493a6fb83e83a11d6ce21c0646bffbd8ac711da3252e2b2ace6e"} Nov 28 13:52:48 crc kubenswrapper[4970]: I1128 13:52:48.890650 4970 scope.go:117] "RemoveContainer" containerID="26166f82d6294755ed8cbbbf2de0a966eac1347f297da23ff2d500d91152689a" Nov 28 13:52:48 crc kubenswrapper[4970]: I1128 13:52:48.930675 4970 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/infra-operator-controller-manager-7c855bfbc4-jhn25"] Nov 28 13:52:48 crc kubenswrapper[4970]: I1128 13:52:48.935996 4970 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/infra-operator-controller-manager-7c855bfbc4-jhn25"] Nov 28 13:52:49 crc kubenswrapper[4970]: I1128 13:52:49.394710 4970 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d8b95186-66e8-493a-8eb9-79e4cd5b5a7d" path="/var/lib/kubelet/pods/d8b95186-66e8-493a-8eb9-79e4cd5b5a7d/volumes" Nov 28 13:52:49 crc kubenswrapper[4970]: I1128 13:52:49.395603 4970 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="defb0064-6731-4f52-872a-a26d8e82dd41" path="/var/lib/kubelet/pods/defb0064-6731-4f52-872a-a26d8e82dd41/volumes" Nov 28 13:52:49 crc kubenswrapper[4970]: I1128 13:52:49.396184 4970 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f9534406-ffb1-48fd-8589-8f5d4bac63a4" path="/var/lib/kubelet/pods/f9534406-ffb1-48fd-8589-8f5d4bac63a4/volumes" Nov 28 13:52:49 crc kubenswrapper[4970]: I1128 13:52:49.415801 4970 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-7bdb74f78d-ct66p" Nov 28 13:52:49 crc kubenswrapper[4970]: I1128 13:52:49.548579 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/4729d0a5-be5a-4c85-83eb-f213d31f3755-apiservice-cert\") pod \"4729d0a5-be5a-4c85-83eb-f213d31f3755\" (UID: \"4729d0a5-be5a-4c85-83eb-f213d31f3755\") " Nov 28 13:52:49 crc kubenswrapper[4970]: I1128 13:52:49.548664 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4wbk\" (UniqueName: \"kubernetes.io/projected/4729d0a5-be5a-4c85-83eb-f213d31f3755-kube-api-access-s4wbk\") pod \"4729d0a5-be5a-4c85-83eb-f213d31f3755\" (UID: \"4729d0a5-be5a-4c85-83eb-f213d31f3755\") " Nov 28 13:52:49 crc kubenswrapper[4970]: I1128 13:52:49.548704 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/4729d0a5-be5a-4c85-83eb-f213d31f3755-webhook-cert\") pod \"4729d0a5-be5a-4c85-83eb-f213d31f3755\" (UID: \"4729d0a5-be5a-4c85-83eb-f213d31f3755\") " Nov 28 13:52:49 crc kubenswrapper[4970]: I1128 13:52:49.556452 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4729d0a5-be5a-4c85-83eb-f213d31f3755-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "4729d0a5-be5a-4c85-83eb-f213d31f3755" (UID: "4729d0a5-be5a-4c85-83eb-f213d31f3755"). InnerVolumeSpecName "webhook-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 28 13:52:49 crc kubenswrapper[4970]: I1128 13:52:49.559548 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4729d0a5-be5a-4c85-83eb-f213d31f3755-kube-api-access-s4wbk" (OuterVolumeSpecName: "kube-api-access-s4wbk") pod "4729d0a5-be5a-4c85-83eb-f213d31f3755" (UID: "4729d0a5-be5a-4c85-83eb-f213d31f3755"). InnerVolumeSpecName "kube-api-access-s4wbk". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 28 13:52:49 crc kubenswrapper[4970]: I1128 13:52:49.561420 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4729d0a5-be5a-4c85-83eb-f213d31f3755-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "4729d0a5-be5a-4c85-83eb-f213d31f3755" (UID: "4729d0a5-be5a-4c85-83eb-f213d31f3755"). InnerVolumeSpecName "apiservice-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 28 13:52:49 crc kubenswrapper[4970]: I1128 13:52:49.649562 4970 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/4729d0a5-be5a-4c85-83eb-f213d31f3755-apiservice-cert\") on node \"crc\" DevicePath \"\"" Nov 28 13:52:49 crc kubenswrapper[4970]: I1128 13:52:49.649598 4970 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4wbk\" (UniqueName: \"kubernetes.io/projected/4729d0a5-be5a-4c85-83eb-f213d31f3755-kube-api-access-s4wbk\") on node \"crc\" DevicePath \"\"" Nov 28 13:52:49 crc kubenswrapper[4970]: I1128 13:52:49.649609 4970 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/4729d0a5-be5a-4c85-83eb-f213d31f3755-webhook-cert\") on node \"crc\" DevicePath \"\"" Nov 28 13:52:49 crc kubenswrapper[4970]: I1128 13:52:49.890575 4970 generic.go:334] "Generic (PLEG): container finished" podID="9d1382f3-12a1-4d6a-84ce-8ca5fb915aa3" containerID="6d6e8d2c141c112a6b3b522c25558a73900d98b7f358a55f70b227ac09cc4b29" exitCode=0 Nov 28 13:52:49 crc kubenswrapper[4970]: I1128 13:52:49.891016 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-index-w2zqv" event={"ID":"9d1382f3-12a1-4d6a-84ce-8ca5fb915aa3","Type":"ContainerDied","Data":"6d6e8d2c141c112a6b3b522c25558a73900d98b7f358a55f70b227ac09cc4b29"} Nov 28 13:52:49 crc kubenswrapper[4970]: I1128 13:52:49.893828 4970 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-7bdb74f78d-ct66p" Nov 28 13:52:49 crc kubenswrapper[4970]: I1128 13:52:49.893872 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-7bdb74f78d-ct66p" event={"ID":"4729d0a5-be5a-4c85-83eb-f213d31f3755","Type":"ContainerDied","Data":"68c76262a55f624e75e6fbaf2ceae091a1c8831d91f622ecc75a1d733ac38d97"} Nov 28 13:52:49 crc kubenswrapper[4970]: I1128 13:52:49.893931 4970 scope.go:117] "RemoveContainer" containerID="14802a15c862493a6fb83e83a11d6ce21c0646bffbd8ac711da3252e2b2ace6e" Nov 28 13:52:49 crc kubenswrapper[4970]: I1128 13:52:49.938582 4970 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-7bdb74f78d-ct66p"] Nov 28 13:52:49 crc kubenswrapper[4970]: I1128 13:52:49.941759 4970 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-7bdb74f78d-ct66p"] Nov 28 13:52:49 crc kubenswrapper[4970]: I1128 13:52:49.974513 4970 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-index-w2zqv" Nov 28 13:52:50 crc kubenswrapper[4970]: I1128 13:52:50.153993 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vkgvh\" (UniqueName: \"kubernetes.io/projected/9d1382f3-12a1-4d6a-84ce-8ca5fb915aa3-kube-api-access-vkgvh\") pod \"9d1382f3-12a1-4d6a-84ce-8ca5fb915aa3\" (UID: \"9d1382f3-12a1-4d6a-84ce-8ca5fb915aa3\") " Nov 28 13:52:50 crc kubenswrapper[4970]: I1128 13:52:50.160319 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d1382f3-12a1-4d6a-84ce-8ca5fb915aa3-kube-api-access-vkgvh" (OuterVolumeSpecName: "kube-api-access-vkgvh") pod "9d1382f3-12a1-4d6a-84ce-8ca5fb915aa3" (UID: "9d1382f3-12a1-4d6a-84ce-8ca5fb915aa3"). 
InnerVolumeSpecName "kube-api-access-vkgvh". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 28 13:52:50 crc kubenswrapper[4970]: I1128 13:52:50.255552 4970 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vkgvh\" (UniqueName: \"kubernetes.io/projected/9d1382f3-12a1-4d6a-84ce-8ca5fb915aa3-kube-api-access-vkgvh\") on node \"crc\" DevicePath \"\"" Nov 28 13:52:50 crc kubenswrapper[4970]: I1128 13:52:50.831863 4970 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-779fc9694b-7mpcn"] Nov 28 13:52:50 crc kubenswrapper[4970]: I1128 13:52:50.832049 4970 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/rabbitmq-cluster-operator-779fc9694b-7mpcn" podUID="4e835339-2c2a-4195-a2b9-d76a7741f412" containerName="operator" containerID="cri-o://733cd32aae0bad2318a592a2d916c7e50fca822264042a0d33f39fa3adcef0fd" gracePeriod=10 Nov 28 13:52:50 crc kubenswrapper[4970]: I1128 13:52:50.905638 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-index-w2zqv" event={"ID":"9d1382f3-12a1-4d6a-84ce-8ca5fb915aa3","Type":"ContainerDied","Data":"b8279e1ad2c372acf6a50bd69758dc106a33f8c7c049b720ed4f3d98ff529639"} Nov 28 13:52:50 crc kubenswrapper[4970]: I1128 13:52:50.905713 4970 scope.go:117] "RemoveContainer" containerID="6d6e8d2c141c112a6b3b522c25558a73900d98b7f358a55f70b227ac09cc4b29" Nov 28 13:52:50 crc kubenswrapper[4970]: I1128 13:52:50.905668 4970 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-index-w2zqv" Nov 28 13:52:50 crc kubenswrapper[4970]: I1128 13:52:50.939292 4970 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/mariadb-operator-index-w2zqv"] Nov 28 13:52:50 crc kubenswrapper[4970]: I1128 13:52:50.942766 4970 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/mariadb-operator-index-w2zqv"] Nov 28 13:52:50 crc kubenswrapper[4970]: E1128 13:52:50.999131 4970 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9d1382f3_12a1_4d6a_84ce_8ca5fb915aa3.slice/crio-b8279e1ad2c372acf6a50bd69758dc106a33f8c7c049b720ed4f3d98ff529639\": RecentStats: unable to find data in memory cache]" Nov 28 13:52:51 crc kubenswrapper[4970]: I1128 13:52:51.089335 4970 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-index-kjd76"] Nov 28 13:52:51 crc kubenswrapper[4970]: I1128 13:52:51.089845 4970 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/rabbitmq-cluster-operator-index-kjd76" podUID="a8bd99b0-8e47-4a38-bedc-aad5cf3a7394" containerName="registry-server" containerID="cri-o://bd0a1d0b1c41038773d89480fb84f837c99ab547ebacd02a7ca87bbd3cc5fccc" gracePeriod=30 Nov 28 13:52:51 crc kubenswrapper[4970]: I1128 13:52:51.112848 4970 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e5906mn5f"] Nov 28 13:52:51 crc kubenswrapper[4970]: I1128 13:52:51.116216 4970 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e5906mn5f"] Nov 28 13:52:51 crc kubenswrapper[4970]: I1128 13:52:51.333848 4970 patch_prober.go:28] interesting pod/machine-config-daemon-tjrng container/machine-config-daemon 
namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 28 13:52:51 crc kubenswrapper[4970]: I1128 13:52:51.333894 4970 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-tjrng" podUID="70bedd43-c527-436e-b47b-0b9ec5b10601" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 28 13:52:51 crc kubenswrapper[4970]: I1128 13:52:51.361127 4970 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-779fc9694b-7mpcn" Nov 28 13:52:51 crc kubenswrapper[4970]: I1128 13:52:51.387364 4970 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0837c7a8-a61d-44b1-9f7d-ad3a758a0b9a" path="/var/lib/kubelet/pods/0837c7a8-a61d-44b1-9f7d-ad3a758a0b9a/volumes" Nov 28 13:52:51 crc kubenswrapper[4970]: I1128 13:52:51.388068 4970 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4729d0a5-be5a-4c85-83eb-f213d31f3755" path="/var/lib/kubelet/pods/4729d0a5-be5a-4c85-83eb-f213d31f3755/volumes" Nov 28 13:52:51 crc kubenswrapper[4970]: I1128 13:52:51.388560 4970 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d1382f3-12a1-4d6a-84ce-8ca5fb915aa3" path="/var/lib/kubelet/pods/9d1382f3-12a1-4d6a-84ce-8ca5fb915aa3/volumes" Nov 28 13:52:51 crc kubenswrapper[4970]: I1128 13:52:51.474474 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k4k4x\" (UniqueName: \"kubernetes.io/projected/4e835339-2c2a-4195-a2b9-d76a7741f412-kube-api-access-k4k4x\") pod \"4e835339-2c2a-4195-a2b9-d76a7741f412\" (UID: \"4e835339-2c2a-4195-a2b9-d76a7741f412\") " Nov 28 13:52:51 crc kubenswrapper[4970]: I1128 13:52:51.480998 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4e835339-2c2a-4195-a2b9-d76a7741f412-kube-api-access-k4k4x" (OuterVolumeSpecName: "kube-api-access-k4k4x") pod "4e835339-2c2a-4195-a2b9-d76a7741f412" (UID: "4e835339-2c2a-4195-a2b9-d76a7741f412"). InnerVolumeSpecName "kube-api-access-k4k4x". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 28 13:52:51 crc kubenswrapper[4970]: I1128 13:52:51.576029 4970 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k4k4x\" (UniqueName: \"kubernetes.io/projected/4e835339-2c2a-4195-a2b9-d76a7741f412-kube-api-access-k4k4x\") on node \"crc\" DevicePath \"\"" Nov 28 13:52:51 crc kubenswrapper[4970]: I1128 13:52:51.869611 4970 scope.go:117] "RemoveContainer" containerID="779f4a67b747c4c45c7bea822c9931534849e1f5483d9ce33f185e777a205d73" Nov 28 13:52:51 crc kubenswrapper[4970]: I1128 13:52:51.890190 4970 scope.go:117] "RemoveContainer" containerID="7505a745e05d58ede03f331bb0333008a0cbdf23e25c01b4f6b3b7f0150d816a" Nov 28 13:52:51 crc kubenswrapper[4970]: I1128 13:52:51.916177 4970 scope.go:117] "RemoveContainer" containerID="168e3dadacb99c825ae65b522ee81165a4066f0f04f3d1c844ddbe57e6ec113a" Nov 28 13:52:51 crc kubenswrapper[4970]: I1128 13:52:51.925759 4970 generic.go:334] "Generic (PLEG): container finished" podID="a8bd99b0-8e47-4a38-bedc-aad5cf3a7394" containerID="bd0a1d0b1c41038773d89480fb84f837c99ab547ebacd02a7ca87bbd3cc5fccc" exitCode=0 Nov 28 13:52:51 crc kubenswrapper[4970]: I1128 13:52:51.925835 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-index-kjd76" event={"ID":"a8bd99b0-8e47-4a38-bedc-aad5cf3a7394","Type":"ContainerDied","Data":"bd0a1d0b1c41038773d89480fb84f837c99ab547ebacd02a7ca87bbd3cc5fccc"} Nov 28 13:52:51 crc kubenswrapper[4970]: I1128 13:52:51.927548 4970 generic.go:334] "Generic (PLEG): container finished" podID="4e835339-2c2a-4195-a2b9-d76a7741f412" containerID="733cd32aae0bad2318a592a2d916c7e50fca822264042a0d33f39fa3adcef0fd" exitCode=0 Nov 28 13:52:51 crc kubenswrapper[4970]: I1128 13:52:51.927577 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-779fc9694b-7mpcn" event={"ID":"4e835339-2c2a-4195-a2b9-d76a7741f412","Type":"ContainerDied","Data":"733cd32aae0bad2318a592a2d916c7e50fca822264042a0d33f39fa3adcef0fd"} Nov 28 13:52:51 crc kubenswrapper[4970]: I1128 13:52:51.927601 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-779fc9694b-7mpcn" event={"ID":"4e835339-2c2a-4195-a2b9-d76a7741f412","Type":"ContainerDied","Data":"3d229bdff3d5e33032adbb2c7d4c6a406d72825d0ecb167f08ed024323122757"} Nov 28 13:52:51 crc kubenswrapper[4970]: I1128 13:52:51.927600 4970 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-779fc9694b-7mpcn" Nov 28 13:52:51 crc kubenswrapper[4970]: I1128 13:52:51.927614 4970 scope.go:117] "RemoveContainer" containerID="733cd32aae0bad2318a592a2d916c7e50fca822264042a0d33f39fa3adcef0fd" Nov 28 13:52:51 crc kubenswrapper[4970]: I1128 13:52:51.936000 4970 scope.go:117] "RemoveContainer" containerID="733cd32aae0bad2318a592a2d916c7e50fca822264042a0d33f39fa3adcef0fd" Nov 28 13:52:51 crc kubenswrapper[4970]: I1128 13:52:51.960134 4970 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-779fc9694b-7mpcn"] Nov 28 13:52:51 crc kubenswrapper[4970]: I1128 13:52:51.962664 4970 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-779fc9694b-7mpcn"] Nov 28 13:52:51 crc kubenswrapper[4970]: E1128 13:52:51.963415 4970 log.go:32] "RemoveContainer from runtime service failed" err="rpc error: code = Unknown desc = failed to delete container k8s_operator_rabbitmq-cluster-operator-779fc9694b-7mpcn_openstack-operators_4e835339-2c2a-4195-a2b9-d76a7741f412_0 in pod sandbox 3d229bdff3d5e33032adbb2c7d4c6a406d72825d0ecb167f08ed024323122757 from index: no such id: '733cd32aae0bad2318a592a2d916c7e50fca822264042a0d33f39fa3adcef0fd'" containerID="733cd32aae0bad2318a592a2d916c7e50fca822264042a0d33f39fa3adcef0fd" Nov 28 13:52:51 crc kubenswrapper[4970]: E1128 13:52:51.963478 4970 kuberuntime_gc.go:150] "Failed to remove container" err="rpc error: code = Unknown desc = failed to delete container k8s_operator_rabbitmq-cluster-operator-779fc9694b-7mpcn_openstack-operators_4e835339-2c2a-4195-a2b9-d76a7741f412_0 in pod sandbox 3d229bdff3d5e33032adbb2c7d4c6a406d72825d0ecb167f08ed024323122757 from index: no such id: '733cd32aae0bad2318a592a2d916c7e50fca822264042a0d33f39fa3adcef0fd'" containerID="733cd32aae0bad2318a592a2d916c7e50fca822264042a0d33f39fa3adcef0fd" Nov 28 13:52:51 crc kubenswrapper[4970]: I1128 13:52:51.963505 4970 scope.go:117] "RemoveContainer" containerID="4a6d69741843b952d01701dd4386ce70c005c8b765423a5b9aaeb7abd0d72438" Nov 28 13:52:51 crc kubenswrapper[4970]: I1128 13:52:51.963598 4970 scope.go:117] "RemoveContainer" containerID="733cd32aae0bad2318a592a2d916c7e50fca822264042a0d33f39fa3adcef0fd" Nov 28 13:52:51 crc kubenswrapper[4970]: E1128 13:52:51.964711 4970 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"733cd32aae0bad2318a592a2d916c7e50fca822264042a0d33f39fa3adcef0fd\": container with ID starting with 733cd32aae0bad2318a592a2d916c7e50fca822264042a0d33f39fa3adcef0fd not found: ID does not exist" containerID="733cd32aae0bad2318a592a2d916c7e50fca822264042a0d33f39fa3adcef0fd" Nov 28 13:52:51 crc kubenswrapper[4970]: I1128 13:52:51.964736 4970 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"733cd32aae0bad2318a592a2d916c7e50fca822264042a0d33f39fa3adcef0fd"} err="failed to get container status \"733cd32aae0bad2318a592a2d916c7e50fca822264042a0d33f39fa3adcef0fd\": rpc error: code = NotFound desc = could not find container \"733cd32aae0bad2318a592a2d916c7e50fca822264042a0d33f39fa3adcef0fd\": container with ID starting with 733cd32aae0bad2318a592a2d916c7e50fca822264042a0d33f39fa3adcef0fd not found: ID does not exist" Nov 28 13:52:51 crc kubenswrapper[4970]: I1128 13:52:51.980313 4970 scope.go:117] "RemoveContainer" containerID="4f5016f71be40c8fa0412d2d3623256df8cd9a5426d0c0b6ba7b92eac73c3a8a" Nov 28 13:52:51 crc 
kubenswrapper[4970]: I1128 13:52:51.999197 4970 scope.go:117] "RemoveContainer" containerID="cfd83d7153ecbe17ab6dbd52dc5de1437c77a0f3d12515724e782589a27622d1" Nov 28 13:52:52 crc kubenswrapper[4970]: I1128 13:52:52.024011 4970 scope.go:117] "RemoveContainer" containerID="150e51d52ee773b3c719f948fff10efa3b4ad3ac77b33ee8ae2891e1053d29df" Nov 28 13:52:52 crc kubenswrapper[4970]: I1128 13:52:52.062014 4970 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-index-kjd76" Nov 28 13:52:52 crc kubenswrapper[4970]: I1128 13:52:52.065024 4970 scope.go:117] "RemoveContainer" containerID="edbbe998ab544d7f6aaa537697b28bda62dfeb981181612410fb6aa92dfbc21a" Nov 28 13:52:52 crc kubenswrapper[4970]: I1128 13:52:52.090506 4970 scope.go:117] "RemoveContainer" containerID="ed6e63b47f11e2b0b740d49500df04701d0adff42d441e36143c3cec25e03beb" Nov 28 13:52:52 crc kubenswrapper[4970]: I1128 13:52:52.108136 4970 scope.go:117] "RemoveContainer" containerID="f03ef2cea48b019af731ad7a6a03ffa6855789bef5b3a51ec250f24482175461" Nov 28 13:52:52 crc kubenswrapper[4970]: I1128 13:52:52.123332 4970 scope.go:117] "RemoveContainer" containerID="bc0871b24a00f4320ff309bb37f630da06ef0fe23bdeefdd761346dd2ffed4b4" Nov 28 13:52:52 crc kubenswrapper[4970]: I1128 13:52:52.136124 4970 scope.go:117] "RemoveContainer" containerID="ac839bd992f65562180f2ee157a333bad648870db1d9b2703daa6060d6d17b4b" Nov 28 13:52:52 crc kubenswrapper[4970]: I1128 13:52:52.156622 4970 scope.go:117] "RemoveContainer" containerID="bd0a1d0b1c41038773d89480fb84f837c99ab547ebacd02a7ca87bbd3cc5fccc" Nov 28 13:52:52 crc kubenswrapper[4970]: I1128 13:52:52.183379 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8gq6x\" (UniqueName: \"kubernetes.io/projected/a8bd99b0-8e47-4a38-bedc-aad5cf3a7394-kube-api-access-8gq6x\") pod \"a8bd99b0-8e47-4a38-bedc-aad5cf3a7394\" (UID: \"a8bd99b0-8e47-4a38-bedc-aad5cf3a7394\") " Nov 28 13:52:52 crc kubenswrapper[4970]: I1128 13:52:52.188531 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a8bd99b0-8e47-4a38-bedc-aad5cf3a7394-kube-api-access-8gq6x" (OuterVolumeSpecName: "kube-api-access-8gq6x") pod "a8bd99b0-8e47-4a38-bedc-aad5cf3a7394" (UID: "a8bd99b0-8e47-4a38-bedc-aad5cf3a7394"). InnerVolumeSpecName "kube-api-access-8gq6x". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 28 13:52:52 crc kubenswrapper[4970]: I1128 13:52:52.285787 4970 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8gq6x\" (UniqueName: \"kubernetes.io/projected/a8bd99b0-8e47-4a38-bedc-aad5cf3a7394-kube-api-access-8gq6x\") on node \"crc\" DevicePath \"\"" Nov 28 13:52:52 crc kubenswrapper[4970]: I1128 13:52:52.935816 4970 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-index-kjd76" Nov 28 13:52:52 crc kubenswrapper[4970]: I1128 13:52:52.935834 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-index-kjd76" event={"ID":"a8bd99b0-8e47-4a38-bedc-aad5cf3a7394","Type":"ContainerDied","Data":"0428e4dd9f595a76ea17a6af15e3dbab5d4935a3a49967745edb48c3d9ef52ba"} Nov 28 13:52:52 crc kubenswrapper[4970]: I1128 13:52:52.990843 4970 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-index-kjd76"] Nov 28 13:52:52 crc kubenswrapper[4970]: I1128 13:52:52.997816 4970 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-index-kjd76"] Nov 28 13:52:53 crc kubenswrapper[4970]: I1128 13:52:53.395210 4970 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4e835339-2c2a-4195-a2b9-d76a7741f412" path="/var/lib/kubelet/pods/4e835339-2c2a-4195-a2b9-d76a7741f412/volumes" Nov 28 13:52:53 crc kubenswrapper[4970]: I1128 13:52:53.397062 4970 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a8bd99b0-8e47-4a38-bedc-aad5cf3a7394" path="/var/lib/kubelet/pods/a8bd99b0-8e47-4a38-bedc-aad5cf3a7394/volumes" Nov 28 13:53:05 crc kubenswrapper[4970]: I1128 13:53:05.063107 4970 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-kx4wk/must-gather-m649z"] Nov 28 13:53:05 crc kubenswrapper[4970]: E1128 13:53:05.063864 4970 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2207ac08-790e-4eff-83cb-82a7b52344ff" containerName="mariadb-account-delete" Nov 28 13:53:05 crc kubenswrapper[4970]: I1128 13:53:05.063876 4970 state_mem.go:107] "Deleted CPUSet assignment" podUID="2207ac08-790e-4eff-83cb-82a7b52344ff" containerName="mariadb-account-delete" Nov 28 13:53:05 crc kubenswrapper[4970]: E1128 13:53:05.063894 4970 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d8b95186-66e8-493a-8eb9-79e4cd5b5a7d" containerName="kube-rbac-proxy" Nov 28 13:53:05 crc kubenswrapper[4970]: I1128 13:53:05.063902 4970 state_mem.go:107] "Deleted CPUSet assignment" podUID="d8b95186-66e8-493a-8eb9-79e4cd5b5a7d" containerName="kube-rbac-proxy" Nov 28 13:53:05 crc kubenswrapper[4970]: E1128 13:53:05.063909 4970 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dfa2f2ae-c626-4fd8-a04c-a762e271a467" containerName="setup-container" Nov 28 13:53:05 crc kubenswrapper[4970]: I1128 13:53:05.063917 4970 state_mem.go:107] "Deleted CPUSet assignment" podUID="dfa2f2ae-c626-4fd8-a04c-a762e271a467" containerName="setup-container" Nov 28 13:53:05 crc kubenswrapper[4970]: E1128 13:53:05.063929 4970 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a65f142b-ad4e-4901-9ca1-8d27e66fc59c" containerName="registry-server" Nov 28 13:53:05 crc kubenswrapper[4970]: I1128 13:53:05.063937 4970 state_mem.go:107] "Deleted CPUSet assignment" podUID="a65f142b-ad4e-4901-9ca1-8d27e66fc59c" containerName="registry-server" Nov 28 13:53:05 crc kubenswrapper[4970]: E1128 13:53:05.063946 4970 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="defb0064-6731-4f52-872a-a26d8e82dd41" containerName="registry-server" Nov 28 13:53:05 crc kubenswrapper[4970]: I1128 13:53:05.063952 4970 state_mem.go:107] "Deleted CPUSet assignment" podUID="defb0064-6731-4f52-872a-a26d8e82dd41" containerName="registry-server" Nov 28 13:53:05 crc kubenswrapper[4970]: E1128 13:53:05.063960 4970 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="d8b95186-66e8-493a-8eb9-79e4cd5b5a7d" containerName="manager" Nov 28 13:53:05 crc kubenswrapper[4970]: I1128 13:53:05.063967 4970 state_mem.go:107] "Deleted CPUSet assignment" podUID="d8b95186-66e8-493a-8eb9-79e4cd5b5a7d" containerName="manager" Nov 28 13:53:05 crc kubenswrapper[4970]: E1128 13:53:05.063975 4970 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9232407d-077b-4ed6-9350-ee386d73677d" containerName="keystone-api" Nov 28 13:53:05 crc kubenswrapper[4970]: I1128 13:53:05.063980 4970 state_mem.go:107] "Deleted CPUSet assignment" podUID="9232407d-077b-4ed6-9350-ee386d73677d" containerName="keystone-api" Nov 28 13:53:05 crc kubenswrapper[4970]: E1128 13:53:05.063989 4970 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b351352a-c436-4df2-9d43-f7dde4bb6a8a" containerName="memcached" Nov 28 13:53:05 crc kubenswrapper[4970]: I1128 13:53:05.063994 4970 state_mem.go:107] "Deleted CPUSet assignment" podUID="b351352a-c436-4df2-9d43-f7dde4bb6a8a" containerName="memcached" Nov 28 13:53:05 crc kubenswrapper[4970]: E1128 13:53:05.064002 4970 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1474c5bc-29c4-4da3-b2e9-900196941f19" containerName="galera" Nov 28 13:53:05 crc kubenswrapper[4970]: I1128 13:53:05.064008 4970 state_mem.go:107] "Deleted CPUSet assignment" podUID="1474c5bc-29c4-4da3-b2e9-900196941f19" containerName="galera" Nov 28 13:53:05 crc kubenswrapper[4970]: E1128 13:53:05.064013 4970 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8bcf5bd0-9824-4c40-b009-8f2e50ad08b0" containerName="manager" Nov 28 13:53:05 crc kubenswrapper[4970]: I1128 13:53:05.064019 4970 state_mem.go:107] "Deleted CPUSet assignment" podUID="8bcf5bd0-9824-4c40-b009-8f2e50ad08b0" containerName="manager" Nov 28 13:53:05 crc kubenswrapper[4970]: E1128 13:53:05.064029 4970 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4e835339-2c2a-4195-a2b9-d76a7741f412" containerName="operator" Nov 28 13:53:05 crc kubenswrapper[4970]: I1128 13:53:05.064035 4970 state_mem.go:107] "Deleted CPUSet assignment" podUID="4e835339-2c2a-4195-a2b9-d76a7741f412" containerName="operator" Nov 28 13:53:05 crc kubenswrapper[4970]: E1128 13:53:05.064043 4970 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dfa2f2ae-c626-4fd8-a04c-a762e271a467" containerName="rabbitmq" Nov 28 13:53:05 crc kubenswrapper[4970]: I1128 13:53:05.064049 4970 state_mem.go:107] "Deleted CPUSet assignment" podUID="dfa2f2ae-c626-4fd8-a04c-a762e271a467" containerName="rabbitmq" Nov 28 13:53:05 crc kubenswrapper[4970]: E1128 13:53:05.064056 4970 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="70137649-04fe-46dd-94ef-03a6ab19aecd" containerName="galera" Nov 28 13:53:05 crc kubenswrapper[4970]: I1128 13:53:05.064063 4970 state_mem.go:107] "Deleted CPUSet assignment" podUID="70137649-04fe-46dd-94ef-03a6ab19aecd" containerName="galera" Nov 28 13:53:05 crc kubenswrapper[4970]: E1128 13:53:05.064072 4970 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3a4491a2-79c8-4e5b-8f2f-6c8182f09885" containerName="galera" Nov 28 13:53:05 crc kubenswrapper[4970]: I1128 13:53:05.064080 4970 state_mem.go:107] "Deleted CPUSet assignment" podUID="3a4491a2-79c8-4e5b-8f2f-6c8182f09885" containerName="galera" Nov 28 13:53:05 crc kubenswrapper[4970]: E1128 13:53:05.064091 4970 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9d1382f3-12a1-4d6a-84ce-8ca5fb915aa3" 
containerName="registry-server" Nov 28 13:53:05 crc kubenswrapper[4970]: I1128 13:53:05.064098 4970 state_mem.go:107] "Deleted CPUSet assignment" podUID="9d1382f3-12a1-4d6a-84ce-8ca5fb915aa3" containerName="registry-server" Nov 28 13:53:05 crc kubenswrapper[4970]: E1128 13:53:05.064109 4970 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="70137649-04fe-46dd-94ef-03a6ab19aecd" containerName="mysql-bootstrap" Nov 28 13:53:05 crc kubenswrapper[4970]: I1128 13:53:05.064116 4970 state_mem.go:107] "Deleted CPUSet assignment" podUID="70137649-04fe-46dd-94ef-03a6ab19aecd" containerName="mysql-bootstrap" Nov 28 13:53:05 crc kubenswrapper[4970]: E1128 13:53:05.064126 4970 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4729d0a5-be5a-4c85-83eb-f213d31f3755" containerName="manager" Nov 28 13:53:05 crc kubenswrapper[4970]: I1128 13:53:05.064133 4970 state_mem.go:107] "Deleted CPUSet assignment" podUID="4729d0a5-be5a-4c85-83eb-f213d31f3755" containerName="manager" Nov 28 13:53:05 crc kubenswrapper[4970]: E1128 13:53:05.064141 4970 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1474c5bc-29c4-4da3-b2e9-900196941f19" containerName="mysql-bootstrap" Nov 28 13:53:05 crc kubenswrapper[4970]: I1128 13:53:05.064151 4970 state_mem.go:107] "Deleted CPUSet assignment" podUID="1474c5bc-29c4-4da3-b2e9-900196941f19" containerName="mysql-bootstrap" Nov 28 13:53:05 crc kubenswrapper[4970]: E1128 13:53:05.064160 4970 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3a4491a2-79c8-4e5b-8f2f-6c8182f09885" containerName="mysql-bootstrap" Nov 28 13:53:05 crc kubenswrapper[4970]: I1128 13:53:05.064166 4970 state_mem.go:107] "Deleted CPUSet assignment" podUID="3a4491a2-79c8-4e5b-8f2f-6c8182f09885" containerName="mysql-bootstrap" Nov 28 13:53:05 crc kubenswrapper[4970]: E1128 13:53:05.064174 4970 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a8bd99b0-8e47-4a38-bedc-aad5cf3a7394" containerName="registry-server" Nov 28 13:53:05 crc kubenswrapper[4970]: I1128 13:53:05.064180 4970 state_mem.go:107] "Deleted CPUSet assignment" podUID="a8bd99b0-8e47-4a38-bedc-aad5cf3a7394" containerName="registry-server" Nov 28 13:53:05 crc kubenswrapper[4970]: I1128 13:53:05.064306 4970 memory_manager.go:354] "RemoveStaleState removing state" podUID="3a4491a2-79c8-4e5b-8f2f-6c8182f09885" containerName="galera" Nov 28 13:53:05 crc kubenswrapper[4970]: I1128 13:53:05.064320 4970 memory_manager.go:354] "RemoveStaleState removing state" podUID="1474c5bc-29c4-4da3-b2e9-900196941f19" containerName="galera" Nov 28 13:53:05 crc kubenswrapper[4970]: I1128 13:53:05.064327 4970 memory_manager.go:354] "RemoveStaleState removing state" podUID="d8b95186-66e8-493a-8eb9-79e4cd5b5a7d" containerName="kube-rbac-proxy" Nov 28 13:53:05 crc kubenswrapper[4970]: I1128 13:53:05.064336 4970 memory_manager.go:354] "RemoveStaleState removing state" podUID="4729d0a5-be5a-4c85-83eb-f213d31f3755" containerName="manager" Nov 28 13:53:05 crc kubenswrapper[4970]: I1128 13:53:05.064345 4970 memory_manager.go:354] "RemoveStaleState removing state" podUID="4e835339-2c2a-4195-a2b9-d76a7741f412" containerName="operator" Nov 28 13:53:05 crc kubenswrapper[4970]: I1128 13:53:05.064353 4970 memory_manager.go:354] "RemoveStaleState removing state" podUID="8bcf5bd0-9824-4c40-b009-8f2e50ad08b0" containerName="manager" Nov 28 13:53:05 crc kubenswrapper[4970]: I1128 13:53:05.064362 4970 memory_manager.go:354] "RemoveStaleState removing state" podUID="defb0064-6731-4f52-872a-a26d8e82dd41" 
containerName="registry-server" Nov 28 13:53:05 crc kubenswrapper[4970]: I1128 13:53:05.064369 4970 memory_manager.go:354] "RemoveStaleState removing state" podUID="a8bd99b0-8e47-4a38-bedc-aad5cf3a7394" containerName="registry-server" Nov 28 13:53:05 crc kubenswrapper[4970]: I1128 13:53:05.064376 4970 memory_manager.go:354] "RemoveStaleState removing state" podUID="b351352a-c436-4df2-9d43-f7dde4bb6a8a" containerName="memcached" Nov 28 13:53:05 crc kubenswrapper[4970]: I1128 13:53:05.064382 4970 memory_manager.go:354] "RemoveStaleState removing state" podUID="dfa2f2ae-c626-4fd8-a04c-a762e271a467" containerName="rabbitmq" Nov 28 13:53:05 crc kubenswrapper[4970]: I1128 13:53:05.064390 4970 memory_manager.go:354] "RemoveStaleState removing state" podUID="9232407d-077b-4ed6-9350-ee386d73677d" containerName="keystone-api" Nov 28 13:53:05 crc kubenswrapper[4970]: I1128 13:53:05.064403 4970 memory_manager.go:354] "RemoveStaleState removing state" podUID="d8b95186-66e8-493a-8eb9-79e4cd5b5a7d" containerName="manager" Nov 28 13:53:05 crc kubenswrapper[4970]: I1128 13:53:05.064410 4970 memory_manager.go:354] "RemoveStaleState removing state" podUID="70137649-04fe-46dd-94ef-03a6ab19aecd" containerName="galera" Nov 28 13:53:05 crc kubenswrapper[4970]: I1128 13:53:05.064418 4970 memory_manager.go:354] "RemoveStaleState removing state" podUID="2207ac08-790e-4eff-83cb-82a7b52344ff" containerName="mariadb-account-delete" Nov 28 13:53:05 crc kubenswrapper[4970]: I1128 13:53:05.064426 4970 memory_manager.go:354] "RemoveStaleState removing state" podUID="a65f142b-ad4e-4901-9ca1-8d27e66fc59c" containerName="registry-server" Nov 28 13:53:05 crc kubenswrapper[4970]: I1128 13:53:05.064434 4970 memory_manager.go:354] "RemoveStaleState removing state" podUID="9d1382f3-12a1-4d6a-84ce-8ca5fb915aa3" containerName="registry-server" Nov 28 13:53:05 crc kubenswrapper[4970]: I1128 13:53:05.064982 4970 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-kx4wk/must-gather-m649z" Nov 28 13:53:05 crc kubenswrapper[4970]: I1128 13:53:05.069618 4970 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-kx4wk"/"kube-root-ca.crt" Nov 28 13:53:05 crc kubenswrapper[4970]: I1128 13:53:05.069665 4970 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-kx4wk"/"openshift-service-ca.crt" Nov 28 13:53:05 crc kubenswrapper[4970]: I1128 13:53:05.088179 4970 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-kx4wk/must-gather-m649z"] Nov 28 13:53:05 crc kubenswrapper[4970]: I1128 13:53:05.158890 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-492gn\" (UniqueName: \"kubernetes.io/projected/c9c75369-2b94-4afb-a56d-70278acf3671-kube-api-access-492gn\") pod \"must-gather-m649z\" (UID: \"c9c75369-2b94-4afb-a56d-70278acf3671\") " pod="openshift-must-gather-kx4wk/must-gather-m649z" Nov 28 13:53:05 crc kubenswrapper[4970]: I1128 13:53:05.159037 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/c9c75369-2b94-4afb-a56d-70278acf3671-must-gather-output\") pod \"must-gather-m649z\" (UID: \"c9c75369-2b94-4afb-a56d-70278acf3671\") " pod="openshift-must-gather-kx4wk/must-gather-m649z" Nov 28 13:53:05 crc kubenswrapper[4970]: I1128 13:53:05.260054 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/c9c75369-2b94-4afb-a56d-70278acf3671-must-gather-output\") pod \"must-gather-m649z\" (UID: \"c9c75369-2b94-4afb-a56d-70278acf3671\") " pod="openshift-must-gather-kx4wk/must-gather-m649z" Nov 28 13:53:05 crc kubenswrapper[4970]: I1128 13:53:05.260505 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/c9c75369-2b94-4afb-a56d-70278acf3671-must-gather-output\") pod \"must-gather-m649z\" (UID: \"c9c75369-2b94-4afb-a56d-70278acf3671\") " pod="openshift-must-gather-kx4wk/must-gather-m649z" Nov 28 13:53:05 crc kubenswrapper[4970]: I1128 13:53:05.260515 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-492gn\" (UniqueName: \"kubernetes.io/projected/c9c75369-2b94-4afb-a56d-70278acf3671-kube-api-access-492gn\") pod \"must-gather-m649z\" (UID: \"c9c75369-2b94-4afb-a56d-70278acf3671\") " pod="openshift-must-gather-kx4wk/must-gather-m649z" Nov 28 13:53:05 crc kubenswrapper[4970]: I1128 13:53:05.279258 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-492gn\" (UniqueName: \"kubernetes.io/projected/c9c75369-2b94-4afb-a56d-70278acf3671-kube-api-access-492gn\") pod \"must-gather-m649z\" (UID: \"c9c75369-2b94-4afb-a56d-70278acf3671\") " pod="openshift-must-gather-kx4wk/must-gather-m649z" Nov 28 13:53:05 crc kubenswrapper[4970]: I1128 13:53:05.382531 4970 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-kx4wk/must-gather-m649z" Nov 28 13:53:05 crc kubenswrapper[4970]: I1128 13:53:05.807300 4970 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-kx4wk/must-gather-m649z"] Nov 28 13:53:05 crc kubenswrapper[4970]: I1128 13:53:05.809558 4970 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Nov 28 13:53:06 crc kubenswrapper[4970]: I1128 13:53:06.015450 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-kx4wk/must-gather-m649z" event={"ID":"c9c75369-2b94-4afb-a56d-70278acf3671","Type":"ContainerStarted","Data":"8e32b8ddbf8da95be6be0eb8faf7e459868927bdef911a8bd280badbb4225e1d"} Nov 28 13:53:14 crc kubenswrapper[4970]: I1128 13:53:14.071129 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-kx4wk/must-gather-m649z" event={"ID":"c9c75369-2b94-4afb-a56d-70278acf3671","Type":"ContainerStarted","Data":"bc17a14679f5fe8e21c07602f9eca43591808521ced6da89f5760fbe2c94ee18"} Nov 28 13:53:15 crc kubenswrapper[4970]: I1128 13:53:15.078173 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-kx4wk/must-gather-m649z" event={"ID":"c9c75369-2b94-4afb-a56d-70278acf3671","Type":"ContainerStarted","Data":"8cde71b05ce800ca3ba88ef9604c6e563bc534873b8c2dc17013428d1ceafa63"} Nov 28 13:53:15 crc kubenswrapper[4970]: I1128 13:53:15.094471 4970 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-kx4wk/must-gather-m649z" podStartSLOduration=2.8833944320000002 podStartE2EDuration="10.094451307s" podCreationTimestamp="2025-11-28 13:53:05 +0000 UTC" firstStartedPulling="2025-11-28 13:53:05.809499305 +0000 UTC m=+1996.662381105" lastFinishedPulling="2025-11-28 13:53:13.02055618 +0000 UTC m=+2003.873437980" observedRunningTime="2025-11-28 13:53:15.089962409 +0000 UTC m=+2005.942844209" watchObservedRunningTime="2025-11-28 13:53:15.094451307 +0000 UTC m=+2005.947333107" Nov 28 13:53:21 crc kubenswrapper[4970]: I1128 13:53:21.333514 4970 patch_prober.go:28] interesting pod/machine-config-daemon-tjrng container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 28 13:53:21 crc kubenswrapper[4970]: I1128 13:53:21.334087 4970 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-tjrng" podUID="70bedd43-c527-436e-b47b-0b9ec5b10601" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 28 13:53:21 crc kubenswrapper[4970]: I1128 13:53:21.334144 4970 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-tjrng" Nov 28 13:53:21 crc kubenswrapper[4970]: I1128 13:53:21.335017 4970 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"989ffc3243eb185213e945a6f9a9c46c00572aa829c5947a5b48998743fc78c3"} pod="openshift-machine-config-operator/machine-config-daemon-tjrng" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Nov 28 13:53:21 crc kubenswrapper[4970]: I1128 13:53:21.335105 4970 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-machine-config-operator/machine-config-daemon-tjrng" podUID="70bedd43-c527-436e-b47b-0b9ec5b10601" containerName="machine-config-daemon" containerID="cri-o://989ffc3243eb185213e945a6f9a9c46c00572aa829c5947a5b48998743fc78c3" gracePeriod=600 Nov 28 13:53:24 crc kubenswrapper[4970]: I1128 13:53:24.135723 4970 generic.go:334] "Generic (PLEG): container finished" podID="70bedd43-c527-436e-b47b-0b9ec5b10601" containerID="989ffc3243eb185213e945a6f9a9c46c00572aa829c5947a5b48998743fc78c3" exitCode=0 Nov 28 13:53:24 crc kubenswrapper[4970]: I1128 13:53:24.135802 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-tjrng" event={"ID":"70bedd43-c527-436e-b47b-0b9ec5b10601","Type":"ContainerDied","Data":"989ffc3243eb185213e945a6f9a9c46c00572aa829c5947a5b48998743fc78c3"} Nov 28 13:53:24 crc kubenswrapper[4970]: I1128 13:53:24.136102 4970 scope.go:117] "RemoveContainer" containerID="8085d03de453d8ba9c452eec6c018c90247e638e269f72dd81e94e709f69fd50" Nov 28 13:53:25 crc kubenswrapper[4970]: I1128 13:53:25.142942 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-tjrng" event={"ID":"70bedd43-c527-436e-b47b-0b9ec5b10601","Type":"ContainerStarted","Data":"27500a4416cd52c2623e856038e9f8aaac3d83bbbbf963c829e97dc2c4d2fa55"} Nov 28 13:53:27 crc kubenswrapper[4970]: I1128 13:53:27.317629 4970 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-pm9fr"] Nov 28 13:53:27 crc kubenswrapper[4970]: I1128 13:53:27.320210 4970 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-pm9fr" Nov 28 13:53:27 crc kubenswrapper[4970]: I1128 13:53:27.330393 4970 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-pm9fr"] Nov 28 13:53:27 crc kubenswrapper[4970]: I1128 13:53:27.457249 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/66e5368d-fafc-4992-af4f-568f2b183d8b-utilities\") pod \"redhat-operators-pm9fr\" (UID: \"66e5368d-fafc-4992-af4f-568f2b183d8b\") " pod="openshift-marketplace/redhat-operators-pm9fr" Nov 28 13:53:27 crc kubenswrapper[4970]: I1128 13:53:27.457304 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/66e5368d-fafc-4992-af4f-568f2b183d8b-catalog-content\") pod \"redhat-operators-pm9fr\" (UID: \"66e5368d-fafc-4992-af4f-568f2b183d8b\") " pod="openshift-marketplace/redhat-operators-pm9fr" Nov 28 13:53:27 crc kubenswrapper[4970]: I1128 13:53:27.457331 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s7rhg\" (UniqueName: \"kubernetes.io/projected/66e5368d-fafc-4992-af4f-568f2b183d8b-kube-api-access-s7rhg\") pod \"redhat-operators-pm9fr\" (UID: \"66e5368d-fafc-4992-af4f-568f2b183d8b\") " pod="openshift-marketplace/redhat-operators-pm9fr" Nov 28 13:53:27 crc kubenswrapper[4970]: I1128 13:53:27.512009 4970 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-g6b2n"] Nov 28 13:53:27 crc kubenswrapper[4970]: I1128 13:53:27.513269 4970 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-g6b2n" Nov 28 13:53:27 crc kubenswrapper[4970]: I1128 13:53:27.527546 4970 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-g6b2n"] Nov 28 13:53:27 crc kubenswrapper[4970]: I1128 13:53:27.558847 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/66e5368d-fafc-4992-af4f-568f2b183d8b-utilities\") pod \"redhat-operators-pm9fr\" (UID: \"66e5368d-fafc-4992-af4f-568f2b183d8b\") " pod="openshift-marketplace/redhat-operators-pm9fr" Nov 28 13:53:27 crc kubenswrapper[4970]: I1128 13:53:27.558901 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/66e5368d-fafc-4992-af4f-568f2b183d8b-catalog-content\") pod \"redhat-operators-pm9fr\" (UID: \"66e5368d-fafc-4992-af4f-568f2b183d8b\") " pod="openshift-marketplace/redhat-operators-pm9fr" Nov 28 13:53:27 crc kubenswrapper[4970]: I1128 13:53:27.558919 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s7rhg\" (UniqueName: \"kubernetes.io/projected/66e5368d-fafc-4992-af4f-568f2b183d8b-kube-api-access-s7rhg\") pod \"redhat-operators-pm9fr\" (UID: \"66e5368d-fafc-4992-af4f-568f2b183d8b\") " pod="openshift-marketplace/redhat-operators-pm9fr" Nov 28 13:53:27 crc kubenswrapper[4970]: I1128 13:53:27.559413 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/66e5368d-fafc-4992-af4f-568f2b183d8b-utilities\") pod \"redhat-operators-pm9fr\" (UID: \"66e5368d-fafc-4992-af4f-568f2b183d8b\") " pod="openshift-marketplace/redhat-operators-pm9fr" Nov 28 13:53:27 crc kubenswrapper[4970]: I1128 13:53:27.559667 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/66e5368d-fafc-4992-af4f-568f2b183d8b-catalog-content\") pod \"redhat-operators-pm9fr\" (UID: \"66e5368d-fafc-4992-af4f-568f2b183d8b\") " pod="openshift-marketplace/redhat-operators-pm9fr" Nov 28 13:53:27 crc kubenswrapper[4970]: I1128 13:53:27.583896 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s7rhg\" (UniqueName: \"kubernetes.io/projected/66e5368d-fafc-4992-af4f-568f2b183d8b-kube-api-access-s7rhg\") pod \"redhat-operators-pm9fr\" (UID: \"66e5368d-fafc-4992-af4f-568f2b183d8b\") " pod="openshift-marketplace/redhat-operators-pm9fr" Nov 28 13:53:27 crc kubenswrapper[4970]: I1128 13:53:27.660064 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xf8f4\" (UniqueName: \"kubernetes.io/projected/bf5b19dd-060a-4df9-b390-56c388d1cf61-kube-api-access-xf8f4\") pod \"community-operators-g6b2n\" (UID: \"bf5b19dd-060a-4df9-b390-56c388d1cf61\") " pod="openshift-marketplace/community-operators-g6b2n" Nov 28 13:53:27 crc kubenswrapper[4970]: I1128 13:53:27.660475 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bf5b19dd-060a-4df9-b390-56c388d1cf61-utilities\") pod \"community-operators-g6b2n\" (UID: \"bf5b19dd-060a-4df9-b390-56c388d1cf61\") " pod="openshift-marketplace/community-operators-g6b2n" Nov 28 13:53:27 crc kubenswrapper[4970]: I1128 13:53:27.660717 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bf5b19dd-060a-4df9-b390-56c388d1cf61-catalog-content\") pod \"community-operators-g6b2n\" (UID: \"bf5b19dd-060a-4df9-b390-56c388d1cf61\") " pod="openshift-marketplace/community-operators-g6b2n" Nov 28 13:53:27 crc kubenswrapper[4970]: I1128 13:53:27.672863 4970 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-pm9fr" Nov 28 13:53:27 crc kubenswrapper[4970]: I1128 13:53:27.761788 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xf8f4\" (UniqueName: \"kubernetes.io/projected/bf5b19dd-060a-4df9-b390-56c388d1cf61-kube-api-access-xf8f4\") pod \"community-operators-g6b2n\" (UID: \"bf5b19dd-060a-4df9-b390-56c388d1cf61\") " pod="openshift-marketplace/community-operators-g6b2n" Nov 28 13:53:27 crc kubenswrapper[4970]: I1128 13:53:27.762075 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bf5b19dd-060a-4df9-b390-56c388d1cf61-utilities\") pod \"community-operators-g6b2n\" (UID: \"bf5b19dd-060a-4df9-b390-56c388d1cf61\") " pod="openshift-marketplace/community-operators-g6b2n" Nov 28 13:53:27 crc kubenswrapper[4970]: I1128 13:53:27.762109 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bf5b19dd-060a-4df9-b390-56c388d1cf61-catalog-content\") pod \"community-operators-g6b2n\" (UID: \"bf5b19dd-060a-4df9-b390-56c388d1cf61\") " pod="openshift-marketplace/community-operators-g6b2n" Nov 28 13:53:27 crc kubenswrapper[4970]: I1128 13:53:27.762543 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bf5b19dd-060a-4df9-b390-56c388d1cf61-utilities\") pod \"community-operators-g6b2n\" (UID: \"bf5b19dd-060a-4df9-b390-56c388d1cf61\") " pod="openshift-marketplace/community-operators-g6b2n" Nov 28 13:53:27 crc kubenswrapper[4970]: I1128 13:53:27.762610 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bf5b19dd-060a-4df9-b390-56c388d1cf61-catalog-content\") pod \"community-operators-g6b2n\" (UID: \"bf5b19dd-060a-4df9-b390-56c388d1cf61\") " pod="openshift-marketplace/community-operators-g6b2n" Nov 28 13:53:27 crc kubenswrapper[4970]: I1128 13:53:27.790954 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xf8f4\" (UniqueName: \"kubernetes.io/projected/bf5b19dd-060a-4df9-b390-56c388d1cf61-kube-api-access-xf8f4\") pod \"community-operators-g6b2n\" (UID: \"bf5b19dd-060a-4df9-b390-56c388d1cf61\") " pod="openshift-marketplace/community-operators-g6b2n" Nov 28 13:53:27 crc kubenswrapper[4970]: I1128 13:53:27.826148 4970 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-g6b2n" Nov 28 13:53:27 crc kubenswrapper[4970]: I1128 13:53:27.935314 4970 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-pm9fr"] Nov 28 13:53:28 crc kubenswrapper[4970]: I1128 13:53:28.164161 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pm9fr" event={"ID":"66e5368d-fafc-4992-af4f-568f2b183d8b","Type":"ContainerStarted","Data":"b3a4fc56a6864c468604451d274236214b143c7f82bdf25ca119c2a3b1e4da8a"} Nov 28 13:53:28 crc kubenswrapper[4970]: I1128 13:53:28.320582 4970 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-g6b2n"] Nov 28 13:53:29 crc kubenswrapper[4970]: I1128 13:53:29.169594 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-g6b2n" event={"ID":"bf5b19dd-060a-4df9-b390-56c388d1cf61","Type":"ContainerStarted","Data":"ff229048c8d749d40cf0a9ce115b56bbcee2e05b9f71c9c326d85629dfca5474"} Nov 28 13:53:31 crc kubenswrapper[4970]: I1128 13:53:31.180771 4970 generic.go:334] "Generic (PLEG): container finished" podID="bf5b19dd-060a-4df9-b390-56c388d1cf61" containerID="ea8e968b58423d6b0ddf92e643c967c2f2020617e308722eb2e88fe15a21683c" exitCode=0 Nov 28 13:53:31 crc kubenswrapper[4970]: I1128 13:53:31.180871 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-g6b2n" event={"ID":"bf5b19dd-060a-4df9-b390-56c388d1cf61","Type":"ContainerDied","Data":"ea8e968b58423d6b0ddf92e643c967c2f2020617e308722eb2e88fe15a21683c"} Nov 28 13:53:31 crc kubenswrapper[4970]: I1128 13:53:31.182605 4970 generic.go:334] "Generic (PLEG): container finished" podID="66e5368d-fafc-4992-af4f-568f2b183d8b" containerID="964cb1aeed50a581b8601549ac9f151f73777053d42d311382c6cea3aecb5620" exitCode=0 Nov 28 13:53:31 crc kubenswrapper[4970]: I1128 13:53:31.182648 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pm9fr" event={"ID":"66e5368d-fafc-4992-af4f-568f2b183d8b","Type":"ContainerDied","Data":"964cb1aeed50a581b8601549ac9f151f73777053d42d311382c6cea3aecb5620"} Nov 28 13:53:32 crc kubenswrapper[4970]: I1128 13:53:32.189306 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pm9fr" event={"ID":"66e5368d-fafc-4992-af4f-568f2b183d8b","Type":"ContainerStarted","Data":"67512b53d708e9cb992ef033fdc5ebfb73747ec3ef446d8f9db88ee72774ef51"} Nov 28 13:53:32 crc kubenswrapper[4970]: I1128 13:53:32.191559 4970 generic.go:334] "Generic (PLEG): container finished" podID="bf5b19dd-060a-4df9-b390-56c388d1cf61" containerID="668692c954a9582579bc78f3b5b82f4c44c518c4b944efb6b0a8365f945b63b1" exitCode=0 Nov 28 13:53:32 crc kubenswrapper[4970]: I1128 13:53:32.191610 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-g6b2n" event={"ID":"bf5b19dd-060a-4df9-b390-56c388d1cf61","Type":"ContainerDied","Data":"668692c954a9582579bc78f3b5b82f4c44c518c4b944efb6b0a8365f945b63b1"} Nov 28 13:53:33 crc kubenswrapper[4970]: I1128 13:53:33.198862 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-g6b2n" event={"ID":"bf5b19dd-060a-4df9-b390-56c388d1cf61","Type":"ContainerStarted","Data":"8e92e1fc310afa89c0870b33e8b66f63a765442ad58efc04100cf443fa492a32"} Nov 28 13:53:33 crc kubenswrapper[4970]: I1128 13:53:33.217517 4970 generic.go:334] "Generic (PLEG): container 
finished" podID="66e5368d-fafc-4992-af4f-568f2b183d8b" containerID="67512b53d708e9cb992ef033fdc5ebfb73747ec3ef446d8f9db88ee72774ef51" exitCode=0 Nov 28 13:53:33 crc kubenswrapper[4970]: I1128 13:53:33.217571 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pm9fr" event={"ID":"66e5368d-fafc-4992-af4f-568f2b183d8b","Type":"ContainerDied","Data":"67512b53d708e9cb992ef033fdc5ebfb73747ec3ef446d8f9db88ee72774ef51"} Nov 28 13:53:33 crc kubenswrapper[4970]: I1128 13:53:33.230949 4970 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-g6b2n" podStartSLOduration=4.750065829 podStartE2EDuration="6.230933695s" podCreationTimestamp="2025-11-28 13:53:27 +0000 UTC" firstStartedPulling="2025-11-28 13:53:31.18308919 +0000 UTC m=+2022.035970990" lastFinishedPulling="2025-11-28 13:53:32.663957056 +0000 UTC m=+2023.516838856" observedRunningTime="2025-11-28 13:53:33.230304367 +0000 UTC m=+2024.083186177" watchObservedRunningTime="2025-11-28 13:53:33.230933695 +0000 UTC m=+2024.083815495" Nov 28 13:53:34 crc kubenswrapper[4970]: I1128 13:53:34.224946 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pm9fr" event={"ID":"66e5368d-fafc-4992-af4f-568f2b183d8b","Type":"ContainerStarted","Data":"59ec55a8babe694bde63eb569849d83ee3970c700f0fcb6385656a1c32e7363d"} Nov 28 13:53:34 crc kubenswrapper[4970]: I1128 13:53:34.251684 4970 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-pm9fr" podStartSLOduration=4.663955789 podStartE2EDuration="7.251666383s" podCreationTimestamp="2025-11-28 13:53:27 +0000 UTC" firstStartedPulling="2025-11-28 13:53:31.183561283 +0000 UTC m=+2022.036443093" lastFinishedPulling="2025-11-28 13:53:33.771271887 +0000 UTC m=+2024.624153687" observedRunningTime="2025-11-28 13:53:34.250573892 +0000 UTC m=+2025.103455702" watchObservedRunningTime="2025-11-28 13:53:34.251666383 +0000 UTC m=+2025.104548183" Nov 28 13:53:37 crc kubenswrapper[4970]: I1128 13:53:37.673987 4970 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-pm9fr" Nov 28 13:53:37 crc kubenswrapper[4970]: I1128 13:53:37.674546 4970 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-pm9fr" Nov 28 13:53:37 crc kubenswrapper[4970]: I1128 13:53:37.826387 4970 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-g6b2n" Nov 28 13:53:37 crc kubenswrapper[4970]: I1128 13:53:37.826435 4970 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-g6b2n" Nov 28 13:53:37 crc kubenswrapper[4970]: I1128 13:53:37.866682 4970 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-g6b2n" Nov 28 13:53:38 crc kubenswrapper[4970]: I1128 13:53:38.283953 4970 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-g6b2n" Nov 28 13:53:38 crc kubenswrapper[4970]: I1128 13:53:38.704650 4970 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-g6b2n"] Nov 28 13:53:38 crc kubenswrapper[4970]: I1128 13:53:38.730660 4970 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-pm9fr" 
podUID="66e5368d-fafc-4992-af4f-568f2b183d8b" containerName="registry-server" probeResult="failure" output=< Nov 28 13:53:38 crc kubenswrapper[4970]: timeout: failed to connect service ":50051" within 1s Nov 28 13:53:38 crc kubenswrapper[4970]: > Nov 28 13:53:40 crc kubenswrapper[4970]: I1128 13:53:40.255860 4970 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-g6b2n" podUID="bf5b19dd-060a-4df9-b390-56c388d1cf61" containerName="registry-server" containerID="cri-o://8e92e1fc310afa89c0870b33e8b66f63a765442ad58efc04100cf443fa492a32" gracePeriod=2 Nov 28 13:53:44 crc kubenswrapper[4970]: I1128 13:53:44.285671 4970 generic.go:334] "Generic (PLEG): container finished" podID="bf5b19dd-060a-4df9-b390-56c388d1cf61" containerID="8e92e1fc310afa89c0870b33e8b66f63a765442ad58efc04100cf443fa492a32" exitCode=0 Nov 28 13:53:44 crc kubenswrapper[4970]: I1128 13:53:44.285728 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-g6b2n" event={"ID":"bf5b19dd-060a-4df9-b390-56c388d1cf61","Type":"ContainerDied","Data":"8e92e1fc310afa89c0870b33e8b66f63a765442ad58efc04100cf443fa492a32"} Nov 28 13:53:45 crc kubenswrapper[4970]: I1128 13:53:45.468118 4970 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-g6b2n" Nov 28 13:53:45 crc kubenswrapper[4970]: I1128 13:53:45.517826 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bf5b19dd-060a-4df9-b390-56c388d1cf61-utilities\") pod \"bf5b19dd-060a-4df9-b390-56c388d1cf61\" (UID: \"bf5b19dd-060a-4df9-b390-56c388d1cf61\") " Nov 28 13:53:45 crc kubenswrapper[4970]: I1128 13:53:45.517906 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bf5b19dd-060a-4df9-b390-56c388d1cf61-catalog-content\") pod \"bf5b19dd-060a-4df9-b390-56c388d1cf61\" (UID: \"bf5b19dd-060a-4df9-b390-56c388d1cf61\") " Nov 28 13:53:45 crc kubenswrapper[4970]: I1128 13:53:45.517931 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xf8f4\" (UniqueName: \"kubernetes.io/projected/bf5b19dd-060a-4df9-b390-56c388d1cf61-kube-api-access-xf8f4\") pod \"bf5b19dd-060a-4df9-b390-56c388d1cf61\" (UID: \"bf5b19dd-060a-4df9-b390-56c388d1cf61\") " Nov 28 13:53:45 crc kubenswrapper[4970]: I1128 13:53:45.519520 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bf5b19dd-060a-4df9-b390-56c388d1cf61-utilities" (OuterVolumeSpecName: "utilities") pod "bf5b19dd-060a-4df9-b390-56c388d1cf61" (UID: "bf5b19dd-060a-4df9-b390-56c388d1cf61"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 28 13:53:45 crc kubenswrapper[4970]: I1128 13:53:45.524133 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf5b19dd-060a-4df9-b390-56c388d1cf61-kube-api-access-xf8f4" (OuterVolumeSpecName: "kube-api-access-xf8f4") pod "bf5b19dd-060a-4df9-b390-56c388d1cf61" (UID: "bf5b19dd-060a-4df9-b390-56c388d1cf61"). InnerVolumeSpecName "kube-api-access-xf8f4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 28 13:53:45 crc kubenswrapper[4970]: I1128 13:53:45.566490 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bf5b19dd-060a-4df9-b390-56c388d1cf61-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "bf5b19dd-060a-4df9-b390-56c388d1cf61" (UID: "bf5b19dd-060a-4df9-b390-56c388d1cf61"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 28 13:53:45 crc kubenswrapper[4970]: I1128 13:53:45.619318 4970 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bf5b19dd-060a-4df9-b390-56c388d1cf61-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 28 13:53:45 crc kubenswrapper[4970]: I1128 13:53:45.619347 4970 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xf8f4\" (UniqueName: \"kubernetes.io/projected/bf5b19dd-060a-4df9-b390-56c388d1cf61-kube-api-access-xf8f4\") on node \"crc\" DevicePath \"\"" Nov 28 13:53:45 crc kubenswrapper[4970]: I1128 13:53:45.619357 4970 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bf5b19dd-060a-4df9-b390-56c388d1cf61-utilities\") on node \"crc\" DevicePath \"\"" Nov 28 13:53:46 crc kubenswrapper[4970]: I1128 13:53:46.304722 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-g6b2n" event={"ID":"bf5b19dd-060a-4df9-b390-56c388d1cf61","Type":"ContainerDied","Data":"ff229048c8d749d40cf0a9ce115b56bbcee2e05b9f71c9c326d85629dfca5474"} Nov 28 13:53:46 crc kubenswrapper[4970]: I1128 13:53:46.304806 4970 scope.go:117] "RemoveContainer" containerID="8e92e1fc310afa89c0870b33e8b66f63a765442ad58efc04100cf443fa492a32" Nov 28 13:53:46 crc kubenswrapper[4970]: I1128 13:53:46.304804 4970 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-g6b2n" Nov 28 13:53:46 crc kubenswrapper[4970]: I1128 13:53:46.334475 4970 scope.go:117] "RemoveContainer" containerID="668692c954a9582579bc78f3b5b82f4c44c518c4b944efb6b0a8365f945b63b1" Nov 28 13:53:46 crc kubenswrapper[4970]: I1128 13:53:46.341798 4970 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-g6b2n"] Nov 28 13:53:46 crc kubenswrapper[4970]: I1128 13:53:46.347881 4970 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-g6b2n"] Nov 28 13:53:46 crc kubenswrapper[4970]: I1128 13:53:46.353689 4970 scope.go:117] "RemoveContainer" containerID="ea8e968b58423d6b0ddf92e643c967c2f2020617e308722eb2e88fe15a21683c" Nov 28 13:53:47 crc kubenswrapper[4970]: I1128 13:53:47.387802 4970 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf5b19dd-060a-4df9-b390-56c388d1cf61" path="/var/lib/kubelet/pods/bf5b19dd-060a-4df9-b390-56c388d1cf61/volumes" Nov 28 13:53:47 crc kubenswrapper[4970]: I1128 13:53:47.723253 4970 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-pm9fr" Nov 28 13:53:47 crc kubenswrapper[4970]: I1128 13:53:47.766882 4970 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-pm9fr" Nov 28 13:53:48 crc kubenswrapper[4970]: I1128 13:53:48.700884 4970 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-pm9fr"] Nov 28 13:53:49 crc kubenswrapper[4970]: I1128 13:53:49.324078 4970 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-pm9fr" podUID="66e5368d-fafc-4992-af4f-568f2b183d8b" containerName="registry-server" containerID="cri-o://59ec55a8babe694bde63eb569849d83ee3970c700f0fcb6385656a1c32e7363d" gracePeriod=2 Nov 28 13:53:49 crc kubenswrapper[4970]: I1128 13:53:49.701310 4970 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-pm9fr" Nov 28 13:53:49 crc kubenswrapper[4970]: I1128 13:53:49.886933 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s7rhg\" (UniqueName: \"kubernetes.io/projected/66e5368d-fafc-4992-af4f-568f2b183d8b-kube-api-access-s7rhg\") pod \"66e5368d-fafc-4992-af4f-568f2b183d8b\" (UID: \"66e5368d-fafc-4992-af4f-568f2b183d8b\") " Nov 28 13:53:49 crc kubenswrapper[4970]: I1128 13:53:49.887135 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/66e5368d-fafc-4992-af4f-568f2b183d8b-catalog-content\") pod \"66e5368d-fafc-4992-af4f-568f2b183d8b\" (UID: \"66e5368d-fafc-4992-af4f-568f2b183d8b\") " Nov 28 13:53:49 crc kubenswrapper[4970]: I1128 13:53:49.887184 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/66e5368d-fafc-4992-af4f-568f2b183d8b-utilities\") pod \"66e5368d-fafc-4992-af4f-568f2b183d8b\" (UID: \"66e5368d-fafc-4992-af4f-568f2b183d8b\") " Nov 28 13:53:49 crc kubenswrapper[4970]: I1128 13:53:49.888290 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/66e5368d-fafc-4992-af4f-568f2b183d8b-utilities" (OuterVolumeSpecName: "utilities") pod "66e5368d-fafc-4992-af4f-568f2b183d8b" (UID: "66e5368d-fafc-4992-af4f-568f2b183d8b"). 
InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 28 13:53:49 crc kubenswrapper[4970]: I1128 13:53:49.893342 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/66e5368d-fafc-4992-af4f-568f2b183d8b-kube-api-access-s7rhg" (OuterVolumeSpecName: "kube-api-access-s7rhg") pod "66e5368d-fafc-4992-af4f-568f2b183d8b" (UID: "66e5368d-fafc-4992-af4f-568f2b183d8b"). InnerVolumeSpecName "kube-api-access-s7rhg". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 28 13:53:49 crc kubenswrapper[4970]: I1128 13:53:49.988782 4970 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/66e5368d-fafc-4992-af4f-568f2b183d8b-utilities\") on node \"crc\" DevicePath \"\"" Nov 28 13:53:49 crc kubenswrapper[4970]: I1128 13:53:49.988838 4970 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s7rhg\" (UniqueName: \"kubernetes.io/projected/66e5368d-fafc-4992-af4f-568f2b183d8b-kube-api-access-s7rhg\") on node \"crc\" DevicePath \"\"" Nov 28 13:53:50 crc kubenswrapper[4970]: I1128 13:53:50.017309 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/66e5368d-fafc-4992-af4f-568f2b183d8b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "66e5368d-fafc-4992-af4f-568f2b183d8b" (UID: "66e5368d-fafc-4992-af4f-568f2b183d8b"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 28 13:53:50 crc kubenswrapper[4970]: I1128 13:53:50.089491 4970 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/66e5368d-fafc-4992-af4f-568f2b183d8b-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 28 13:53:50 crc kubenswrapper[4970]: I1128 13:53:50.333978 4970 generic.go:334] "Generic (PLEG): container finished" podID="66e5368d-fafc-4992-af4f-568f2b183d8b" containerID="59ec55a8babe694bde63eb569849d83ee3970c700f0fcb6385656a1c32e7363d" exitCode=0 Nov 28 13:53:50 crc kubenswrapper[4970]: I1128 13:53:50.334036 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pm9fr" event={"ID":"66e5368d-fafc-4992-af4f-568f2b183d8b","Type":"ContainerDied","Data":"59ec55a8babe694bde63eb569849d83ee3970c700f0fcb6385656a1c32e7363d"} Nov 28 13:53:50 crc kubenswrapper[4970]: I1128 13:53:50.334102 4970 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-pm9fr" Nov 28 13:53:50 crc kubenswrapper[4970]: I1128 13:53:50.334133 4970 scope.go:117] "RemoveContainer" containerID="59ec55a8babe694bde63eb569849d83ee3970c700f0fcb6385656a1c32e7363d" Nov 28 13:53:50 crc kubenswrapper[4970]: I1128 13:53:50.334109 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pm9fr" event={"ID":"66e5368d-fafc-4992-af4f-568f2b183d8b","Type":"ContainerDied","Data":"b3a4fc56a6864c468604451d274236214b143c7f82bdf25ca119c2a3b1e4da8a"} Nov 28 13:53:50 crc kubenswrapper[4970]: I1128 13:53:50.356931 4970 scope.go:117] "RemoveContainer" containerID="67512b53d708e9cb992ef033fdc5ebfb73747ec3ef446d8f9db88ee72774ef51" Nov 28 13:53:50 crc kubenswrapper[4970]: I1128 13:53:50.380756 4970 scope.go:117] "RemoveContainer" containerID="964cb1aeed50a581b8601549ac9f151f73777053d42d311382c6cea3aecb5620" Nov 28 13:53:50 crc kubenswrapper[4970]: I1128 13:53:50.390925 4970 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-pm9fr"] Nov 28 13:53:50 crc kubenswrapper[4970]: I1128 13:53:50.397050 4970 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-pm9fr"] Nov 28 13:53:50 crc kubenswrapper[4970]: I1128 13:53:50.416505 4970 scope.go:117] "RemoveContainer" containerID="59ec55a8babe694bde63eb569849d83ee3970c700f0fcb6385656a1c32e7363d" Nov 28 13:53:50 crc kubenswrapper[4970]: E1128 13:53:50.417053 4970 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"59ec55a8babe694bde63eb569849d83ee3970c700f0fcb6385656a1c32e7363d\": container with ID starting with 59ec55a8babe694bde63eb569849d83ee3970c700f0fcb6385656a1c32e7363d not found: ID does not exist" containerID="59ec55a8babe694bde63eb569849d83ee3970c700f0fcb6385656a1c32e7363d" Nov 28 13:53:50 crc kubenswrapper[4970]: I1128 13:53:50.417087 4970 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"59ec55a8babe694bde63eb569849d83ee3970c700f0fcb6385656a1c32e7363d"} err="failed to get container status \"59ec55a8babe694bde63eb569849d83ee3970c700f0fcb6385656a1c32e7363d\": rpc error: code = NotFound desc = could not find container \"59ec55a8babe694bde63eb569849d83ee3970c700f0fcb6385656a1c32e7363d\": container with ID starting with 59ec55a8babe694bde63eb569849d83ee3970c700f0fcb6385656a1c32e7363d not found: ID does not exist" Nov 28 13:53:50 crc kubenswrapper[4970]: I1128 13:53:50.417112 4970 scope.go:117] "RemoveContainer" containerID="67512b53d708e9cb992ef033fdc5ebfb73747ec3ef446d8f9db88ee72774ef51" Nov 28 13:53:50 crc kubenswrapper[4970]: E1128 13:53:50.417537 4970 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"67512b53d708e9cb992ef033fdc5ebfb73747ec3ef446d8f9db88ee72774ef51\": container with ID starting with 67512b53d708e9cb992ef033fdc5ebfb73747ec3ef446d8f9db88ee72774ef51 not found: ID does not exist" containerID="67512b53d708e9cb992ef033fdc5ebfb73747ec3ef446d8f9db88ee72774ef51" Nov 28 13:53:50 crc kubenswrapper[4970]: I1128 13:53:50.417556 4970 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"67512b53d708e9cb992ef033fdc5ebfb73747ec3ef446d8f9db88ee72774ef51"} err="failed to get container status \"67512b53d708e9cb992ef033fdc5ebfb73747ec3ef446d8f9db88ee72774ef51\": rpc error: code = NotFound desc = could not find container 
\"67512b53d708e9cb992ef033fdc5ebfb73747ec3ef446d8f9db88ee72774ef51\": container with ID starting with 67512b53d708e9cb992ef033fdc5ebfb73747ec3ef446d8f9db88ee72774ef51 not found: ID does not exist" Nov 28 13:53:50 crc kubenswrapper[4970]: I1128 13:53:50.417568 4970 scope.go:117] "RemoveContainer" containerID="964cb1aeed50a581b8601549ac9f151f73777053d42d311382c6cea3aecb5620" Nov 28 13:53:50 crc kubenswrapper[4970]: E1128 13:53:50.417913 4970 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"964cb1aeed50a581b8601549ac9f151f73777053d42d311382c6cea3aecb5620\": container with ID starting with 964cb1aeed50a581b8601549ac9f151f73777053d42d311382c6cea3aecb5620 not found: ID does not exist" containerID="964cb1aeed50a581b8601549ac9f151f73777053d42d311382c6cea3aecb5620" Nov 28 13:53:50 crc kubenswrapper[4970]: I1128 13:53:50.417927 4970 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"964cb1aeed50a581b8601549ac9f151f73777053d42d311382c6cea3aecb5620"} err="failed to get container status \"964cb1aeed50a581b8601549ac9f151f73777053d42d311382c6cea3aecb5620\": rpc error: code = NotFound desc = could not find container \"964cb1aeed50a581b8601549ac9f151f73777053d42d311382c6cea3aecb5620\": container with ID starting with 964cb1aeed50a581b8601549ac9f151f73777053d42d311382c6cea3aecb5620 not found: ID does not exist" Nov 28 13:53:51 crc kubenswrapper[4970]: I1128 13:53:51.392674 4970 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="66e5368d-fafc-4992-af4f-568f2b183d8b" path="/var/lib/kubelet/pods/66e5368d-fafc-4992-af4f-568f2b183d8b/volumes" Nov 28 13:53:55 crc kubenswrapper[4970]: I1128 13:53:55.813332 4970 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-hvcsr_b9e4bbc0-c71d-4cb0-82ab-a3c67a9a4894/control-plane-machine-set-operator/0.log" Nov 28 13:53:55 crc kubenswrapper[4970]: I1128 13:53:55.952683 4970 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-wms6k_5998853c-3fbb-403e-b222-5a5c939dbb58/machine-api-operator/0.log" Nov 28 13:53:55 crc kubenswrapper[4970]: I1128 13:53:55.982733 4970 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-wms6k_5998853c-3fbb-403e-b222-5a5c939dbb58/kube-rbac-proxy/0.log" Nov 28 13:54:10 crc kubenswrapper[4970]: I1128 13:54:10.106230 4970 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-f8648f98b-7s5w4_e1eb09e8-cbb7-416b-9683-a42a8b611239/kube-rbac-proxy/0.log" Nov 28 13:54:10 crc kubenswrapper[4970]: I1128 13:54:10.165791 4970 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-f8648f98b-7s5w4_e1eb09e8-cbb7-416b-9683-a42a8b611239/controller/0.log" Nov 28 13:54:10 crc kubenswrapper[4970]: I1128 13:54:10.283625 4970 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-7qf2x_d8109ee2-cb6f-4706-a9d0-93fbec9b4234/cp-frr-files/0.log" Nov 28 13:54:10 crc kubenswrapper[4970]: I1128 13:54:10.446860 4970 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-7qf2x_d8109ee2-cb6f-4706-a9d0-93fbec9b4234/cp-reloader/0.log" Nov 28 13:54:10 crc kubenswrapper[4970]: I1128 13:54:10.451376 4970 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-7qf2x_d8109ee2-cb6f-4706-a9d0-93fbec9b4234/cp-frr-files/0.log" 
Nov 28 13:54:10 crc kubenswrapper[4970]: I1128 13:54:10.456056 4970 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-7qf2x_d8109ee2-cb6f-4706-a9d0-93fbec9b4234/cp-metrics/0.log" Nov 28 13:54:10 crc kubenswrapper[4970]: I1128 13:54:10.484368 4970 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-7qf2x_d8109ee2-cb6f-4706-a9d0-93fbec9b4234/cp-reloader/0.log" Nov 28 13:54:10 crc kubenswrapper[4970]: I1128 13:54:10.635055 4970 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-7qf2x_d8109ee2-cb6f-4706-a9d0-93fbec9b4234/cp-frr-files/0.log" Nov 28 13:54:10 crc kubenswrapper[4970]: I1128 13:54:10.674419 4970 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-7qf2x_d8109ee2-cb6f-4706-a9d0-93fbec9b4234/cp-metrics/0.log" Nov 28 13:54:10 crc kubenswrapper[4970]: I1128 13:54:10.680104 4970 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-7qf2x_d8109ee2-cb6f-4706-a9d0-93fbec9b4234/cp-reloader/0.log" Nov 28 13:54:10 crc kubenswrapper[4970]: I1128 13:54:10.687733 4970 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-7qf2x_d8109ee2-cb6f-4706-a9d0-93fbec9b4234/cp-metrics/0.log" Nov 28 13:54:10 crc kubenswrapper[4970]: I1128 13:54:10.826559 4970 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-7qf2x_d8109ee2-cb6f-4706-a9d0-93fbec9b4234/cp-metrics/0.log" Nov 28 13:54:10 crc kubenswrapper[4970]: I1128 13:54:10.826624 4970 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-7qf2x_d8109ee2-cb6f-4706-a9d0-93fbec9b4234/cp-reloader/0.log" Nov 28 13:54:10 crc kubenswrapper[4970]: I1128 13:54:10.841369 4970 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-7qf2x_d8109ee2-cb6f-4706-a9d0-93fbec9b4234/controller/0.log" Nov 28 13:54:10 crc kubenswrapper[4970]: I1128 13:54:10.850124 4970 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-7qf2x_d8109ee2-cb6f-4706-a9d0-93fbec9b4234/cp-frr-files/0.log" Nov 28 13:54:10 crc kubenswrapper[4970]: I1128 13:54:10.972374 4970 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-7qf2x_d8109ee2-cb6f-4706-a9d0-93fbec9b4234/frr-metrics/0.log" Nov 28 13:54:11 crc kubenswrapper[4970]: I1128 13:54:11.006846 4970 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-7qf2x_d8109ee2-cb6f-4706-a9d0-93fbec9b4234/kube-rbac-proxy-frr/0.log" Nov 28 13:54:11 crc kubenswrapper[4970]: I1128 13:54:11.017923 4970 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-7qf2x_d8109ee2-cb6f-4706-a9d0-93fbec9b4234/kube-rbac-proxy/0.log" Nov 28 13:54:11 crc kubenswrapper[4970]: I1128 13:54:11.193064 4970 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-7qf2x_d8109ee2-cb6f-4706-a9d0-93fbec9b4234/reloader/0.log" Nov 28 13:54:11 crc kubenswrapper[4970]: I1128 13:54:11.204937 4970 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-7fcb986d4-tdbd5_539244ae-76b7-443b-9352-5d8d2f8da8e9/frr-k8s-webhook-server/0.log" Nov 28 13:54:11 crc kubenswrapper[4970]: I1128 13:54:11.351767 4970 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-7qf2x_d8109ee2-cb6f-4706-a9d0-93fbec9b4234/frr/0.log" Nov 28 13:54:11 crc kubenswrapper[4970]: I1128 13:54:11.397570 4970 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_metallb-operator-controller-manager-7b8bd764cc-xwfzf_04e4d11f-bb00-41b3-9047-0669f0e051c2/manager/0.log" Nov 28 13:54:11 crc kubenswrapper[4970]: I1128 13:54:11.507937 4970 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-778f645448-g7nc5_3c970e39-c0b1-4690-8e0a-f925a49d72a9/webhook-server/0.log" Nov 28 13:54:11 crc kubenswrapper[4970]: I1128 13:54:11.554888 4970 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-vj6nx_ce7e9380-adac-4723-8ced-16693bce1923/kube-rbac-proxy/0.log" Nov 28 13:54:11 crc kubenswrapper[4970]: I1128 13:54:11.706647 4970 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-vj6nx_ce7e9380-adac-4723-8ced-16693bce1923/speaker/0.log" Nov 28 13:54:33 crc kubenswrapper[4970]: I1128 13:54:33.520120 4970 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83hlqts_03a02bc0-9c7f-440d-8cf3-0f21ddb0ff2c/util/0.log" Nov 28 13:54:33 crc kubenswrapper[4970]: I1128 13:54:33.744317 4970 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83hlqts_03a02bc0-9c7f-440d-8cf3-0f21ddb0ff2c/util/0.log" Nov 28 13:54:33 crc kubenswrapper[4970]: I1128 13:54:33.763559 4970 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83hlqts_03a02bc0-9c7f-440d-8cf3-0f21ddb0ff2c/pull/0.log" Nov 28 13:54:33 crc kubenswrapper[4970]: I1128 13:54:33.780180 4970 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83hlqts_03a02bc0-9c7f-440d-8cf3-0f21ddb0ff2c/pull/0.log" Nov 28 13:54:33 crc kubenswrapper[4970]: I1128 13:54:33.892518 4970 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83hlqts_03a02bc0-9c7f-440d-8cf3-0f21ddb0ff2c/util/0.log" Nov 28 13:54:33 crc kubenswrapper[4970]: I1128 13:54:33.914921 4970 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83hlqts_03a02bc0-9c7f-440d-8cf3-0f21ddb0ff2c/pull/0.log" Nov 28 13:54:33 crc kubenswrapper[4970]: I1128 13:54:33.935400 4970 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83hlqts_03a02bc0-9c7f-440d-8cf3-0f21ddb0ff2c/extract/0.log" Nov 28 13:54:34 crc kubenswrapper[4970]: I1128 13:54:34.057580 4970 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-wbnmk_fcde0e22-6f82-4495-932f-e5e57f31d4f7/extract-utilities/0.log" Nov 28 13:54:34 crc kubenswrapper[4970]: I1128 13:54:34.219191 4970 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-wbnmk_fcde0e22-6f82-4495-932f-e5e57f31d4f7/extract-utilities/0.log" Nov 28 13:54:34 crc kubenswrapper[4970]: I1128 13:54:34.221091 4970 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-wbnmk_fcde0e22-6f82-4495-932f-e5e57f31d4f7/extract-content/0.log" Nov 28 13:54:34 crc kubenswrapper[4970]: I1128 13:54:34.243909 4970 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_certified-operators-wbnmk_fcde0e22-6f82-4495-932f-e5e57f31d4f7/extract-content/0.log" Nov 28 13:54:34 crc kubenswrapper[4970]: I1128 13:54:34.398075 4970 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-wbnmk_fcde0e22-6f82-4495-932f-e5e57f31d4f7/extract-utilities/0.log" Nov 28 13:54:34 crc kubenswrapper[4970]: I1128 13:54:34.430632 4970 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-wbnmk_fcde0e22-6f82-4495-932f-e5e57f31d4f7/extract-content/0.log" Nov 28 13:54:34 crc kubenswrapper[4970]: I1128 13:54:34.677194 4970 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-wbnmk_fcde0e22-6f82-4495-932f-e5e57f31d4f7/registry-server/0.log" Nov 28 13:54:34 crc kubenswrapper[4970]: I1128 13:54:34.733322 4970 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-dhrhm_bda49097-ef3b-4e2f-8f8c-cb54ea0818b7/extract-utilities/0.log" Nov 28 13:54:34 crc kubenswrapper[4970]: I1128 13:54:34.936591 4970 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-dhrhm_bda49097-ef3b-4e2f-8f8c-cb54ea0818b7/extract-utilities/0.log" Nov 28 13:54:34 crc kubenswrapper[4970]: I1128 13:54:34.963790 4970 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-dhrhm_bda49097-ef3b-4e2f-8f8c-cb54ea0818b7/extract-content/0.log" Nov 28 13:54:34 crc kubenswrapper[4970]: I1128 13:54:34.967869 4970 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-dhrhm_bda49097-ef3b-4e2f-8f8c-cb54ea0818b7/extract-content/0.log" Nov 28 13:54:35 crc kubenswrapper[4970]: I1128 13:54:35.080780 4970 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-dhrhm_bda49097-ef3b-4e2f-8f8c-cb54ea0818b7/extract-utilities/0.log" Nov 28 13:54:35 crc kubenswrapper[4970]: I1128 13:54:35.127483 4970 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-dhrhm_bda49097-ef3b-4e2f-8f8c-cb54ea0818b7/extract-content/0.log" Nov 28 13:54:35 crc kubenswrapper[4970]: I1128 13:54:35.318857 4970 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-6lxld_cea9d8a5-d14f-4f9c-a800-815168dd799e/marketplace-operator/0.log" Nov 28 13:54:35 crc kubenswrapper[4970]: I1128 13:54:35.385961 4970 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-9f9f2_933ca994-f31b-4c5a-b068-8942618eb443/extract-utilities/0.log" Nov 28 13:54:35 crc kubenswrapper[4970]: I1128 13:54:35.510826 4970 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-dhrhm_bda49097-ef3b-4e2f-8f8c-cb54ea0818b7/registry-server/0.log" Nov 28 13:54:35 crc kubenswrapper[4970]: I1128 13:54:35.555938 4970 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-9f9f2_933ca994-f31b-4c5a-b068-8942618eb443/extract-content/0.log" Nov 28 13:54:35 crc kubenswrapper[4970]: I1128 13:54:35.623252 4970 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-9f9f2_933ca994-f31b-4c5a-b068-8942618eb443/extract-utilities/0.log" Nov 28 13:54:35 crc kubenswrapper[4970]: I1128 13:54:35.631977 4970 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_redhat-marketplace-9f9f2_933ca994-f31b-4c5a-b068-8942618eb443/extract-content/0.log" Nov 28 13:54:35 crc kubenswrapper[4970]: I1128 13:54:35.747171 4970 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-9f9f2_933ca994-f31b-4c5a-b068-8942618eb443/extract-utilities/0.log" Nov 28 13:54:35 crc kubenswrapper[4970]: I1128 13:54:35.770702 4970 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-9f9f2_933ca994-f31b-4c5a-b068-8942618eb443/extract-content/0.log" Nov 28 13:54:35 crc kubenswrapper[4970]: I1128 13:54:35.898702 4970 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-9f9f2_933ca994-f31b-4c5a-b068-8942618eb443/registry-server/0.log" Nov 28 13:54:35 crc kubenswrapper[4970]: I1128 13:54:35.934020 4970 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-z7zkq_dd8f781f-7121-4875-adec-2318c2ecd8e2/extract-utilities/0.log" Nov 28 13:54:36 crc kubenswrapper[4970]: I1128 13:54:36.127117 4970 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-z7zkq_dd8f781f-7121-4875-adec-2318c2ecd8e2/extract-content/0.log" Nov 28 13:54:36 crc kubenswrapper[4970]: I1128 13:54:36.130397 4970 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-z7zkq_dd8f781f-7121-4875-adec-2318c2ecd8e2/extract-utilities/0.log" Nov 28 13:54:36 crc kubenswrapper[4970]: I1128 13:54:36.153539 4970 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-z7zkq_dd8f781f-7121-4875-adec-2318c2ecd8e2/extract-content/0.log" Nov 28 13:54:36 crc kubenswrapper[4970]: I1128 13:54:36.307163 4970 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-z7zkq_dd8f781f-7121-4875-adec-2318c2ecd8e2/extract-utilities/0.log" Nov 28 13:54:36 crc kubenswrapper[4970]: I1128 13:54:36.338160 4970 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-z7zkq_dd8f781f-7121-4875-adec-2318c2ecd8e2/extract-content/0.log" Nov 28 13:54:36 crc kubenswrapper[4970]: I1128 13:54:36.587407 4970 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-z7zkq_dd8f781f-7121-4875-adec-2318c2ecd8e2/registry-server/0.log" Nov 28 13:54:42 crc kubenswrapper[4970]: I1128 13:54:42.396744 4970 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-c2r4m"] Nov 28 13:54:42 crc kubenswrapper[4970]: E1128 13:54:42.397523 4970 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="66e5368d-fafc-4992-af4f-568f2b183d8b" containerName="extract-utilities" Nov 28 13:54:42 crc kubenswrapper[4970]: I1128 13:54:42.397539 4970 state_mem.go:107] "Deleted CPUSet assignment" podUID="66e5368d-fafc-4992-af4f-568f2b183d8b" containerName="extract-utilities" Nov 28 13:54:42 crc kubenswrapper[4970]: E1128 13:54:42.397550 4970 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="66e5368d-fafc-4992-af4f-568f2b183d8b" containerName="extract-content" Nov 28 13:54:42 crc kubenswrapper[4970]: I1128 13:54:42.397556 4970 state_mem.go:107] "Deleted CPUSet assignment" podUID="66e5368d-fafc-4992-af4f-568f2b183d8b" containerName="extract-content" Nov 28 13:54:42 crc kubenswrapper[4970]: E1128 13:54:42.397569 4970 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="bf5b19dd-060a-4df9-b390-56c388d1cf61" containerName="extract-utilities" Nov 28 13:54:42 crc kubenswrapper[4970]: I1128 13:54:42.397576 4970 state_mem.go:107] "Deleted CPUSet assignment" podUID="bf5b19dd-060a-4df9-b390-56c388d1cf61" containerName="extract-utilities" Nov 28 13:54:42 crc kubenswrapper[4970]: E1128 13:54:42.397590 4970 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bf5b19dd-060a-4df9-b390-56c388d1cf61" containerName="registry-server" Nov 28 13:54:42 crc kubenswrapper[4970]: I1128 13:54:42.397597 4970 state_mem.go:107] "Deleted CPUSet assignment" podUID="bf5b19dd-060a-4df9-b390-56c388d1cf61" containerName="registry-server" Nov 28 13:54:42 crc kubenswrapper[4970]: E1128 13:54:42.397606 4970 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bf5b19dd-060a-4df9-b390-56c388d1cf61" containerName="extract-content" Nov 28 13:54:42 crc kubenswrapper[4970]: I1128 13:54:42.397612 4970 state_mem.go:107] "Deleted CPUSet assignment" podUID="bf5b19dd-060a-4df9-b390-56c388d1cf61" containerName="extract-content" Nov 28 13:54:42 crc kubenswrapper[4970]: E1128 13:54:42.397625 4970 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="66e5368d-fafc-4992-af4f-568f2b183d8b" containerName="registry-server" Nov 28 13:54:42 crc kubenswrapper[4970]: I1128 13:54:42.397631 4970 state_mem.go:107] "Deleted CPUSet assignment" podUID="66e5368d-fafc-4992-af4f-568f2b183d8b" containerName="registry-server" Nov 28 13:54:42 crc kubenswrapper[4970]: I1128 13:54:42.397735 4970 memory_manager.go:354] "RemoveStaleState removing state" podUID="66e5368d-fafc-4992-af4f-568f2b183d8b" containerName="registry-server" Nov 28 13:54:42 crc kubenswrapper[4970]: I1128 13:54:42.397746 4970 memory_manager.go:354] "RemoveStaleState removing state" podUID="bf5b19dd-060a-4df9-b390-56c388d1cf61" containerName="registry-server" Nov 28 13:54:42 crc kubenswrapper[4970]: I1128 13:54:42.398581 4970 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-c2r4m" Nov 28 13:54:42 crc kubenswrapper[4970]: I1128 13:54:42.408075 4970 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-c2r4m"] Nov 28 13:54:42 crc kubenswrapper[4970]: I1128 13:54:42.437349 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7ac19047-ab1d-4629-9895-7df74b96dca9-utilities\") pod \"certified-operators-c2r4m\" (UID: \"7ac19047-ab1d-4629-9895-7df74b96dca9\") " pod="openshift-marketplace/certified-operators-c2r4m" Nov 28 13:54:42 crc kubenswrapper[4970]: I1128 13:54:42.437404 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5k2ht\" (UniqueName: \"kubernetes.io/projected/7ac19047-ab1d-4629-9895-7df74b96dca9-kube-api-access-5k2ht\") pod \"certified-operators-c2r4m\" (UID: \"7ac19047-ab1d-4629-9895-7df74b96dca9\") " pod="openshift-marketplace/certified-operators-c2r4m" Nov 28 13:54:42 crc kubenswrapper[4970]: I1128 13:54:42.437465 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7ac19047-ab1d-4629-9895-7df74b96dca9-catalog-content\") pod \"certified-operators-c2r4m\" (UID: \"7ac19047-ab1d-4629-9895-7df74b96dca9\") " pod="openshift-marketplace/certified-operators-c2r4m" Nov 28 13:54:42 crc kubenswrapper[4970]: I1128 13:54:42.538111 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7ac19047-ab1d-4629-9895-7df74b96dca9-utilities\") pod \"certified-operators-c2r4m\" (UID: \"7ac19047-ab1d-4629-9895-7df74b96dca9\") " pod="openshift-marketplace/certified-operators-c2r4m" Nov 28 13:54:42 crc kubenswrapper[4970]: I1128 13:54:42.538173 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5k2ht\" (UniqueName: \"kubernetes.io/projected/7ac19047-ab1d-4629-9895-7df74b96dca9-kube-api-access-5k2ht\") pod \"certified-operators-c2r4m\" (UID: \"7ac19047-ab1d-4629-9895-7df74b96dca9\") " pod="openshift-marketplace/certified-operators-c2r4m" Nov 28 13:54:42 crc kubenswrapper[4970]: I1128 13:54:42.538201 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7ac19047-ab1d-4629-9895-7df74b96dca9-catalog-content\") pod \"certified-operators-c2r4m\" (UID: \"7ac19047-ab1d-4629-9895-7df74b96dca9\") " pod="openshift-marketplace/certified-operators-c2r4m" Nov 28 13:54:42 crc kubenswrapper[4970]: I1128 13:54:42.538679 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7ac19047-ab1d-4629-9895-7df74b96dca9-utilities\") pod \"certified-operators-c2r4m\" (UID: \"7ac19047-ab1d-4629-9895-7df74b96dca9\") " pod="openshift-marketplace/certified-operators-c2r4m" Nov 28 13:54:42 crc kubenswrapper[4970]: I1128 13:54:42.538931 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7ac19047-ab1d-4629-9895-7df74b96dca9-catalog-content\") pod \"certified-operators-c2r4m\" (UID: \"7ac19047-ab1d-4629-9895-7df74b96dca9\") " pod="openshift-marketplace/certified-operators-c2r4m" Nov 28 13:54:42 crc kubenswrapper[4970]: I1128 13:54:42.559148 4970 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-5k2ht\" (UniqueName: \"kubernetes.io/projected/7ac19047-ab1d-4629-9895-7df74b96dca9-kube-api-access-5k2ht\") pod \"certified-operators-c2r4m\" (UID: \"7ac19047-ab1d-4629-9895-7df74b96dca9\") " pod="openshift-marketplace/certified-operators-c2r4m" Nov 28 13:54:42 crc kubenswrapper[4970]: I1128 13:54:42.716502 4970 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-c2r4m" Nov 28 13:54:43 crc kubenswrapper[4970]: I1128 13:54:43.156758 4970 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-c2r4m"] Nov 28 13:54:43 crc kubenswrapper[4970]: I1128 13:54:43.629288 4970 generic.go:334] "Generic (PLEG): container finished" podID="7ac19047-ab1d-4629-9895-7df74b96dca9" containerID="5de5adb2d928e404fca567e3ac0a73e9d1c75ff4b9acd2000cc8885ad7517aaa" exitCode=0 Nov 28 13:54:43 crc kubenswrapper[4970]: I1128 13:54:43.629631 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-c2r4m" event={"ID":"7ac19047-ab1d-4629-9895-7df74b96dca9","Type":"ContainerDied","Data":"5de5adb2d928e404fca567e3ac0a73e9d1c75ff4b9acd2000cc8885ad7517aaa"} Nov 28 13:54:43 crc kubenswrapper[4970]: I1128 13:54:43.629668 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-c2r4m" event={"ID":"7ac19047-ab1d-4629-9895-7df74b96dca9","Type":"ContainerStarted","Data":"5049b0569ce862c3ee752cbdcf596b6d914cfcfc304226fe8fa14e281051777f"} Nov 28 13:54:45 crc kubenswrapper[4970]: I1128 13:54:45.641827 4970 generic.go:334] "Generic (PLEG): container finished" podID="7ac19047-ab1d-4629-9895-7df74b96dca9" containerID="4e93d588e368b28a9ebca382337d58fdc31be9b07bf92095fc424513d58e7652" exitCode=0 Nov 28 13:54:45 crc kubenswrapper[4970]: I1128 13:54:45.641890 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-c2r4m" event={"ID":"7ac19047-ab1d-4629-9895-7df74b96dca9","Type":"ContainerDied","Data":"4e93d588e368b28a9ebca382337d58fdc31be9b07bf92095fc424513d58e7652"} Nov 28 13:54:46 crc kubenswrapper[4970]: I1128 13:54:46.650428 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-c2r4m" event={"ID":"7ac19047-ab1d-4629-9895-7df74b96dca9","Type":"ContainerStarted","Data":"8195ead5288798b447d6ad52f749f4fd36a1d93fd6c211a3716ff367570a4d2b"} Nov 28 13:54:46 crc kubenswrapper[4970]: I1128 13:54:46.671886 4970 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-c2r4m" podStartSLOduration=2.130299756 podStartE2EDuration="4.671870468s" podCreationTimestamp="2025-11-28 13:54:42 +0000 UTC" firstStartedPulling="2025-11-28 13:54:43.631576058 +0000 UTC m=+2094.484457898" lastFinishedPulling="2025-11-28 13:54:46.17314681 +0000 UTC m=+2097.026028610" observedRunningTime="2025-11-28 13:54:46.669717797 +0000 UTC m=+2097.522599597" watchObservedRunningTime="2025-11-28 13:54:46.671870468 +0000 UTC m=+2097.524752268" Nov 28 13:54:52 crc kubenswrapper[4970]: I1128 13:54:52.716690 4970 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-c2r4m" Nov 28 13:54:52 crc kubenswrapper[4970]: I1128 13:54:52.717150 4970 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-c2r4m" Nov 28 13:54:52 crc kubenswrapper[4970]: I1128 13:54:52.755271 4970 
kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-c2r4m" Nov 28 13:54:53 crc kubenswrapper[4970]: I1128 13:54:53.778579 4970 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-c2r4m" Nov 28 13:54:53 crc kubenswrapper[4970]: I1128 13:54:53.825662 4970 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-c2r4m"] Nov 28 13:54:55 crc kubenswrapper[4970]: I1128 13:54:55.703513 4970 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-c2r4m" podUID="7ac19047-ab1d-4629-9895-7df74b96dca9" containerName="registry-server" containerID="cri-o://8195ead5288798b447d6ad52f749f4fd36a1d93fd6c211a3716ff367570a4d2b" gracePeriod=2 Nov 28 13:54:57 crc kubenswrapper[4970]: I1128 13:54:57.174537 4970 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-c2r4m" Nov 28 13:54:57 crc kubenswrapper[4970]: I1128 13:54:57.268379 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7ac19047-ab1d-4629-9895-7df74b96dca9-catalog-content\") pod \"7ac19047-ab1d-4629-9895-7df74b96dca9\" (UID: \"7ac19047-ab1d-4629-9895-7df74b96dca9\") " Nov 28 13:54:57 crc kubenswrapper[4970]: I1128 13:54:57.268743 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7ac19047-ab1d-4629-9895-7df74b96dca9-utilities\") pod \"7ac19047-ab1d-4629-9895-7df74b96dca9\" (UID: \"7ac19047-ab1d-4629-9895-7df74b96dca9\") " Nov 28 13:54:57 crc kubenswrapper[4970]: I1128 13:54:57.268858 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5k2ht\" (UniqueName: \"kubernetes.io/projected/7ac19047-ab1d-4629-9895-7df74b96dca9-kube-api-access-5k2ht\") pod \"7ac19047-ab1d-4629-9895-7df74b96dca9\" (UID: \"7ac19047-ab1d-4629-9895-7df74b96dca9\") " Nov 28 13:54:57 crc kubenswrapper[4970]: I1128 13:54:57.269588 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7ac19047-ab1d-4629-9895-7df74b96dca9-utilities" (OuterVolumeSpecName: "utilities") pod "7ac19047-ab1d-4629-9895-7df74b96dca9" (UID: "7ac19047-ab1d-4629-9895-7df74b96dca9"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 28 13:54:57 crc kubenswrapper[4970]: I1128 13:54:57.273523 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7ac19047-ab1d-4629-9895-7df74b96dca9-kube-api-access-5k2ht" (OuterVolumeSpecName: "kube-api-access-5k2ht") pod "7ac19047-ab1d-4629-9895-7df74b96dca9" (UID: "7ac19047-ab1d-4629-9895-7df74b96dca9"). InnerVolumeSpecName "kube-api-access-5k2ht". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 28 13:54:57 crc kubenswrapper[4970]: I1128 13:54:57.324501 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7ac19047-ab1d-4629-9895-7df74b96dca9-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "7ac19047-ab1d-4629-9895-7df74b96dca9" (UID: "7ac19047-ab1d-4629-9895-7df74b96dca9"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 28 13:54:57 crc kubenswrapper[4970]: I1128 13:54:57.370053 4970 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7ac19047-ab1d-4629-9895-7df74b96dca9-utilities\") on node \"crc\" DevicePath \"\"" Nov 28 13:54:57 crc kubenswrapper[4970]: I1128 13:54:57.370096 4970 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5k2ht\" (UniqueName: \"kubernetes.io/projected/7ac19047-ab1d-4629-9895-7df74b96dca9-kube-api-access-5k2ht\") on node \"crc\" DevicePath \"\"" Nov 28 13:54:57 crc kubenswrapper[4970]: I1128 13:54:57.370110 4970 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7ac19047-ab1d-4629-9895-7df74b96dca9-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 28 13:54:57 crc kubenswrapper[4970]: I1128 13:54:57.718488 4970 generic.go:334] "Generic (PLEG): container finished" podID="7ac19047-ab1d-4629-9895-7df74b96dca9" containerID="8195ead5288798b447d6ad52f749f4fd36a1d93fd6c211a3716ff367570a4d2b" exitCode=0 Nov 28 13:54:57 crc kubenswrapper[4970]: I1128 13:54:57.718542 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-c2r4m" event={"ID":"7ac19047-ab1d-4629-9895-7df74b96dca9","Type":"ContainerDied","Data":"8195ead5288798b447d6ad52f749f4fd36a1d93fd6c211a3716ff367570a4d2b"} Nov 28 13:54:57 crc kubenswrapper[4970]: I1128 13:54:57.718572 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-c2r4m" event={"ID":"7ac19047-ab1d-4629-9895-7df74b96dca9","Type":"ContainerDied","Data":"5049b0569ce862c3ee752cbdcf596b6d914cfcfc304226fe8fa14e281051777f"} Nov 28 13:54:57 crc kubenswrapper[4970]: I1128 13:54:57.718592 4970 scope.go:117] "RemoveContainer" containerID="8195ead5288798b447d6ad52f749f4fd36a1d93fd6c211a3716ff367570a4d2b" Nov 28 13:54:57 crc kubenswrapper[4970]: I1128 13:54:57.719351 4970 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-c2r4m" Nov 28 13:54:57 crc kubenswrapper[4970]: I1128 13:54:57.739897 4970 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-c2r4m"] Nov 28 13:54:57 crc kubenswrapper[4970]: I1128 13:54:57.744322 4970 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-c2r4m"] Nov 28 13:54:57 crc kubenswrapper[4970]: I1128 13:54:57.749322 4970 scope.go:117] "RemoveContainer" containerID="4e93d588e368b28a9ebca382337d58fdc31be9b07bf92095fc424513d58e7652" Nov 28 13:54:57 crc kubenswrapper[4970]: I1128 13:54:57.777843 4970 scope.go:117] "RemoveContainer" containerID="5de5adb2d928e404fca567e3ac0a73e9d1c75ff4b9acd2000cc8885ad7517aaa" Nov 28 13:54:57 crc kubenswrapper[4970]: I1128 13:54:57.808235 4970 scope.go:117] "RemoveContainer" containerID="8195ead5288798b447d6ad52f749f4fd36a1d93fd6c211a3716ff367570a4d2b" Nov 28 13:54:57 crc kubenswrapper[4970]: E1128 13:54:57.808646 4970 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8195ead5288798b447d6ad52f749f4fd36a1d93fd6c211a3716ff367570a4d2b\": container with ID starting with 8195ead5288798b447d6ad52f749f4fd36a1d93fd6c211a3716ff367570a4d2b not found: ID does not exist" containerID="8195ead5288798b447d6ad52f749f4fd36a1d93fd6c211a3716ff367570a4d2b" Nov 28 13:54:57 crc kubenswrapper[4970]: I1128 13:54:57.808772 4970 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8195ead5288798b447d6ad52f749f4fd36a1d93fd6c211a3716ff367570a4d2b"} err="failed to get container status \"8195ead5288798b447d6ad52f749f4fd36a1d93fd6c211a3716ff367570a4d2b\": rpc error: code = NotFound desc = could not find container \"8195ead5288798b447d6ad52f749f4fd36a1d93fd6c211a3716ff367570a4d2b\": container with ID starting with 8195ead5288798b447d6ad52f749f4fd36a1d93fd6c211a3716ff367570a4d2b not found: ID does not exist" Nov 28 13:54:57 crc kubenswrapper[4970]: I1128 13:54:57.808858 4970 scope.go:117] "RemoveContainer" containerID="4e93d588e368b28a9ebca382337d58fdc31be9b07bf92095fc424513d58e7652" Nov 28 13:54:57 crc kubenswrapper[4970]: E1128 13:54:57.809197 4970 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4e93d588e368b28a9ebca382337d58fdc31be9b07bf92095fc424513d58e7652\": container with ID starting with 4e93d588e368b28a9ebca382337d58fdc31be9b07bf92095fc424513d58e7652 not found: ID does not exist" containerID="4e93d588e368b28a9ebca382337d58fdc31be9b07bf92095fc424513d58e7652" Nov 28 13:54:57 crc kubenswrapper[4970]: I1128 13:54:57.809315 4970 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4e93d588e368b28a9ebca382337d58fdc31be9b07bf92095fc424513d58e7652"} err="failed to get container status \"4e93d588e368b28a9ebca382337d58fdc31be9b07bf92095fc424513d58e7652\": rpc error: code = NotFound desc = could not find container \"4e93d588e368b28a9ebca382337d58fdc31be9b07bf92095fc424513d58e7652\": container with ID starting with 4e93d588e368b28a9ebca382337d58fdc31be9b07bf92095fc424513d58e7652 not found: ID does not exist" Nov 28 13:54:57 crc kubenswrapper[4970]: I1128 13:54:57.809382 4970 scope.go:117] "RemoveContainer" containerID="5de5adb2d928e404fca567e3ac0a73e9d1c75ff4b9acd2000cc8885ad7517aaa" Nov 28 13:54:57 crc kubenswrapper[4970]: E1128 13:54:57.809832 4970 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"5de5adb2d928e404fca567e3ac0a73e9d1c75ff4b9acd2000cc8885ad7517aaa\": container with ID starting with 5de5adb2d928e404fca567e3ac0a73e9d1c75ff4b9acd2000cc8885ad7517aaa not found: ID does not exist" containerID="5de5adb2d928e404fca567e3ac0a73e9d1c75ff4b9acd2000cc8885ad7517aaa" Nov 28 13:54:57 crc kubenswrapper[4970]: I1128 13:54:57.809921 4970 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5de5adb2d928e404fca567e3ac0a73e9d1c75ff4b9acd2000cc8885ad7517aaa"} err="failed to get container status \"5de5adb2d928e404fca567e3ac0a73e9d1c75ff4b9acd2000cc8885ad7517aaa\": rpc error: code = NotFound desc = could not find container \"5de5adb2d928e404fca567e3ac0a73e9d1c75ff4b9acd2000cc8885ad7517aaa\": container with ID starting with 5de5adb2d928e404fca567e3ac0a73e9d1c75ff4b9acd2000cc8885ad7517aaa not found: ID does not exist" Nov 28 13:54:59 crc kubenswrapper[4970]: I1128 13:54:59.390785 4970 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7ac19047-ab1d-4629-9895-7df74b96dca9" path="/var/lib/kubelet/pods/7ac19047-ab1d-4629-9895-7df74b96dca9/volumes" Nov 28 13:55:40 crc kubenswrapper[4970]: I1128 13:55:40.993346 4970 generic.go:334] "Generic (PLEG): container finished" podID="c9c75369-2b94-4afb-a56d-70278acf3671" containerID="bc17a14679f5fe8e21c07602f9eca43591808521ced6da89f5760fbe2c94ee18" exitCode=0 Nov 28 13:55:40 crc kubenswrapper[4970]: I1128 13:55:40.993445 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-kx4wk/must-gather-m649z" event={"ID":"c9c75369-2b94-4afb-a56d-70278acf3671","Type":"ContainerDied","Data":"bc17a14679f5fe8e21c07602f9eca43591808521ced6da89f5760fbe2c94ee18"} Nov 28 13:55:40 crc kubenswrapper[4970]: I1128 13:55:40.994678 4970 scope.go:117] "RemoveContainer" containerID="bc17a14679f5fe8e21c07602f9eca43591808521ced6da89f5760fbe2c94ee18" Nov 28 13:55:41 crc kubenswrapper[4970]: I1128 13:55:41.544649 4970 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-kx4wk_must-gather-m649z_c9c75369-2b94-4afb-a56d-70278acf3671/gather/0.log" Nov 28 13:55:48 crc kubenswrapper[4970]: I1128 13:55:48.630821 4970 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-kx4wk/must-gather-m649z"] Nov 28 13:55:48 crc kubenswrapper[4970]: I1128 13:55:48.631930 4970 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-kx4wk/must-gather-m649z" podUID="c9c75369-2b94-4afb-a56d-70278acf3671" containerName="copy" containerID="cri-o://8cde71b05ce800ca3ba88ef9604c6e563bc534873b8c2dc17013428d1ceafa63" gracePeriod=2 Nov 28 13:55:48 crc kubenswrapper[4970]: I1128 13:55:48.634526 4970 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-kx4wk/must-gather-m649z"] Nov 28 13:55:49 crc kubenswrapper[4970]: I1128 13:55:49.030030 4970 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-kx4wk_must-gather-m649z_c9c75369-2b94-4afb-a56d-70278acf3671/copy/0.log" Nov 28 13:55:49 crc kubenswrapper[4970]: I1128 13:55:49.030665 4970 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-kx4wk/must-gather-m649z" Nov 28 13:55:49 crc kubenswrapper[4970]: I1128 13:55:49.063553 4970 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-kx4wk_must-gather-m649z_c9c75369-2b94-4afb-a56d-70278acf3671/copy/0.log" Nov 28 13:55:49 crc kubenswrapper[4970]: I1128 13:55:49.063875 4970 generic.go:334] "Generic (PLEG): container finished" podID="c9c75369-2b94-4afb-a56d-70278acf3671" containerID="8cde71b05ce800ca3ba88ef9604c6e563bc534873b8c2dc17013428d1ceafa63" exitCode=143 Nov 28 13:55:49 crc kubenswrapper[4970]: I1128 13:55:49.063925 4970 scope.go:117] "RemoveContainer" containerID="8cde71b05ce800ca3ba88ef9604c6e563bc534873b8c2dc17013428d1ceafa63" Nov 28 13:55:49 crc kubenswrapper[4970]: I1128 13:55:49.063953 4970 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-kx4wk/must-gather-m649z" Nov 28 13:55:49 crc kubenswrapper[4970]: I1128 13:55:49.082432 4970 scope.go:117] "RemoveContainer" containerID="bc17a14679f5fe8e21c07602f9eca43591808521ced6da89f5760fbe2c94ee18" Nov 28 13:55:49 crc kubenswrapper[4970]: I1128 13:55:49.116541 4970 scope.go:117] "RemoveContainer" containerID="8cde71b05ce800ca3ba88ef9604c6e563bc534873b8c2dc17013428d1ceafa63" Nov 28 13:55:49 crc kubenswrapper[4970]: E1128 13:55:49.117646 4970 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8cde71b05ce800ca3ba88ef9604c6e563bc534873b8c2dc17013428d1ceafa63\": container with ID starting with 8cde71b05ce800ca3ba88ef9604c6e563bc534873b8c2dc17013428d1ceafa63 not found: ID does not exist" containerID="8cde71b05ce800ca3ba88ef9604c6e563bc534873b8c2dc17013428d1ceafa63" Nov 28 13:55:49 crc kubenswrapper[4970]: I1128 13:55:49.117678 4970 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8cde71b05ce800ca3ba88ef9604c6e563bc534873b8c2dc17013428d1ceafa63"} err="failed to get container status \"8cde71b05ce800ca3ba88ef9604c6e563bc534873b8c2dc17013428d1ceafa63\": rpc error: code = NotFound desc = could not find container \"8cde71b05ce800ca3ba88ef9604c6e563bc534873b8c2dc17013428d1ceafa63\": container with ID starting with 8cde71b05ce800ca3ba88ef9604c6e563bc534873b8c2dc17013428d1ceafa63 not found: ID does not exist" Nov 28 13:55:49 crc kubenswrapper[4970]: I1128 13:55:49.117700 4970 scope.go:117] "RemoveContainer" containerID="bc17a14679f5fe8e21c07602f9eca43591808521ced6da89f5760fbe2c94ee18" Nov 28 13:55:49 crc kubenswrapper[4970]: E1128 13:55:49.118094 4970 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bc17a14679f5fe8e21c07602f9eca43591808521ced6da89f5760fbe2c94ee18\": container with ID starting with bc17a14679f5fe8e21c07602f9eca43591808521ced6da89f5760fbe2c94ee18 not found: ID does not exist" containerID="bc17a14679f5fe8e21c07602f9eca43591808521ced6da89f5760fbe2c94ee18" Nov 28 13:55:49 crc kubenswrapper[4970]: I1128 13:55:49.118115 4970 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bc17a14679f5fe8e21c07602f9eca43591808521ced6da89f5760fbe2c94ee18"} err="failed to get container status \"bc17a14679f5fe8e21c07602f9eca43591808521ced6da89f5760fbe2c94ee18\": rpc error: code = NotFound desc = could not find container \"bc17a14679f5fe8e21c07602f9eca43591808521ced6da89f5760fbe2c94ee18\": container with ID starting with 
bc17a14679f5fe8e21c07602f9eca43591808521ced6da89f5760fbe2c94ee18 not found: ID does not exist" Nov 28 13:55:49 crc kubenswrapper[4970]: I1128 13:55:49.204203 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/c9c75369-2b94-4afb-a56d-70278acf3671-must-gather-output\") pod \"c9c75369-2b94-4afb-a56d-70278acf3671\" (UID: \"c9c75369-2b94-4afb-a56d-70278acf3671\") " Nov 28 13:55:49 crc kubenswrapper[4970]: I1128 13:55:49.204280 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-492gn\" (UniqueName: \"kubernetes.io/projected/c9c75369-2b94-4afb-a56d-70278acf3671-kube-api-access-492gn\") pod \"c9c75369-2b94-4afb-a56d-70278acf3671\" (UID: \"c9c75369-2b94-4afb-a56d-70278acf3671\") " Nov 28 13:55:49 crc kubenswrapper[4970]: I1128 13:55:49.213374 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c9c75369-2b94-4afb-a56d-70278acf3671-kube-api-access-492gn" (OuterVolumeSpecName: "kube-api-access-492gn") pod "c9c75369-2b94-4afb-a56d-70278acf3671" (UID: "c9c75369-2b94-4afb-a56d-70278acf3671"). InnerVolumeSpecName "kube-api-access-492gn". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 28 13:55:49 crc kubenswrapper[4970]: I1128 13:55:49.255050 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c9c75369-2b94-4afb-a56d-70278acf3671-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "c9c75369-2b94-4afb-a56d-70278acf3671" (UID: "c9c75369-2b94-4afb-a56d-70278acf3671"). InnerVolumeSpecName "must-gather-output". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 28 13:55:49 crc kubenswrapper[4970]: I1128 13:55:49.305517 4970 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/c9c75369-2b94-4afb-a56d-70278acf3671-must-gather-output\") on node \"crc\" DevicePath \"\"" Nov 28 13:55:49 crc kubenswrapper[4970]: I1128 13:55:49.305562 4970 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-492gn\" (UniqueName: \"kubernetes.io/projected/c9c75369-2b94-4afb-a56d-70278acf3671-kube-api-access-492gn\") on node \"crc\" DevicePath \"\"" Nov 28 13:55:49 crc kubenswrapper[4970]: I1128 13:55:49.390414 4970 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c9c75369-2b94-4afb-a56d-70278acf3671" path="/var/lib/kubelet/pods/c9c75369-2b94-4afb-a56d-70278acf3671/volumes" Nov 28 13:55:51 crc kubenswrapper[4970]: I1128 13:55:51.334308 4970 patch_prober.go:28] interesting pod/machine-config-daemon-tjrng container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 28 13:55:51 crc kubenswrapper[4970]: I1128 13:55:51.334711 4970 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-tjrng" podUID="70bedd43-c527-436e-b47b-0b9ec5b10601" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 28 13:56:21 crc kubenswrapper[4970]: I1128 13:56:21.333757 4970 patch_prober.go:28] interesting pod/machine-config-daemon-tjrng container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get 
\"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 28 13:56:21 crc kubenswrapper[4970]: I1128 13:56:21.334253 4970 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-tjrng" podUID="70bedd43-c527-436e-b47b-0b9ec5b10601" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 28 13:56:28 crc kubenswrapper[4970]: I1128 13:56:28.468711 4970 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-qdfgc"] Nov 28 13:56:28 crc kubenswrapper[4970]: E1128 13:56:28.469482 4970 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7ac19047-ab1d-4629-9895-7df74b96dca9" containerName="extract-utilities" Nov 28 13:56:28 crc kubenswrapper[4970]: I1128 13:56:28.469498 4970 state_mem.go:107] "Deleted CPUSet assignment" podUID="7ac19047-ab1d-4629-9895-7df74b96dca9" containerName="extract-utilities" Nov 28 13:56:28 crc kubenswrapper[4970]: E1128 13:56:28.469509 4970 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c9c75369-2b94-4afb-a56d-70278acf3671" containerName="gather" Nov 28 13:56:28 crc kubenswrapper[4970]: I1128 13:56:28.469516 4970 state_mem.go:107] "Deleted CPUSet assignment" podUID="c9c75369-2b94-4afb-a56d-70278acf3671" containerName="gather" Nov 28 13:56:28 crc kubenswrapper[4970]: E1128 13:56:28.469529 4970 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7ac19047-ab1d-4629-9895-7df74b96dca9" containerName="registry-server" Nov 28 13:56:28 crc kubenswrapper[4970]: I1128 13:56:28.469535 4970 state_mem.go:107] "Deleted CPUSet assignment" podUID="7ac19047-ab1d-4629-9895-7df74b96dca9" containerName="registry-server" Nov 28 13:56:28 crc kubenswrapper[4970]: E1128 13:56:28.469548 4970 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c9c75369-2b94-4afb-a56d-70278acf3671" containerName="copy" Nov 28 13:56:28 crc kubenswrapper[4970]: I1128 13:56:28.469554 4970 state_mem.go:107] "Deleted CPUSet assignment" podUID="c9c75369-2b94-4afb-a56d-70278acf3671" containerName="copy" Nov 28 13:56:28 crc kubenswrapper[4970]: E1128 13:56:28.469563 4970 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7ac19047-ab1d-4629-9895-7df74b96dca9" containerName="extract-content" Nov 28 13:56:28 crc kubenswrapper[4970]: I1128 13:56:28.469569 4970 state_mem.go:107] "Deleted CPUSet assignment" podUID="7ac19047-ab1d-4629-9895-7df74b96dca9" containerName="extract-content" Nov 28 13:56:28 crc kubenswrapper[4970]: I1128 13:56:28.469665 4970 memory_manager.go:354] "RemoveStaleState removing state" podUID="7ac19047-ab1d-4629-9895-7df74b96dca9" containerName="registry-server" Nov 28 13:56:28 crc kubenswrapper[4970]: I1128 13:56:28.469676 4970 memory_manager.go:354] "RemoveStaleState removing state" podUID="c9c75369-2b94-4afb-a56d-70278acf3671" containerName="copy" Nov 28 13:56:28 crc kubenswrapper[4970]: I1128 13:56:28.469689 4970 memory_manager.go:354] "RemoveStaleState removing state" podUID="c9c75369-2b94-4afb-a56d-70278acf3671" containerName="gather" Nov 28 13:56:28 crc kubenswrapper[4970]: I1128 13:56:28.470441 4970 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-qdfgc" Nov 28 13:56:28 crc kubenswrapper[4970]: I1128 13:56:28.494364 4970 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-qdfgc"] Nov 28 13:56:28 crc kubenswrapper[4970]: I1128 13:56:28.631480 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/11be63f5-e7f3-41b4-90b4-84890b429ced-catalog-content\") pod \"redhat-marketplace-qdfgc\" (UID: \"11be63f5-e7f3-41b4-90b4-84890b429ced\") " pod="openshift-marketplace/redhat-marketplace-qdfgc" Nov 28 13:56:28 crc kubenswrapper[4970]: I1128 13:56:28.631588 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/11be63f5-e7f3-41b4-90b4-84890b429ced-utilities\") pod \"redhat-marketplace-qdfgc\" (UID: \"11be63f5-e7f3-41b4-90b4-84890b429ced\") " pod="openshift-marketplace/redhat-marketplace-qdfgc" Nov 28 13:56:28 crc kubenswrapper[4970]: I1128 13:56:28.631705 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ff8hj\" (UniqueName: \"kubernetes.io/projected/11be63f5-e7f3-41b4-90b4-84890b429ced-kube-api-access-ff8hj\") pod \"redhat-marketplace-qdfgc\" (UID: \"11be63f5-e7f3-41b4-90b4-84890b429ced\") " pod="openshift-marketplace/redhat-marketplace-qdfgc" Nov 28 13:56:28 crc kubenswrapper[4970]: I1128 13:56:28.733272 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ff8hj\" (UniqueName: \"kubernetes.io/projected/11be63f5-e7f3-41b4-90b4-84890b429ced-kube-api-access-ff8hj\") pod \"redhat-marketplace-qdfgc\" (UID: \"11be63f5-e7f3-41b4-90b4-84890b429ced\") " pod="openshift-marketplace/redhat-marketplace-qdfgc" Nov 28 13:56:28 crc kubenswrapper[4970]: I1128 13:56:28.733374 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/11be63f5-e7f3-41b4-90b4-84890b429ced-catalog-content\") pod \"redhat-marketplace-qdfgc\" (UID: \"11be63f5-e7f3-41b4-90b4-84890b429ced\") " pod="openshift-marketplace/redhat-marketplace-qdfgc" Nov 28 13:56:28 crc kubenswrapper[4970]: I1128 13:56:28.733446 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/11be63f5-e7f3-41b4-90b4-84890b429ced-utilities\") pod \"redhat-marketplace-qdfgc\" (UID: \"11be63f5-e7f3-41b4-90b4-84890b429ced\") " pod="openshift-marketplace/redhat-marketplace-qdfgc" Nov 28 13:56:28 crc kubenswrapper[4970]: I1128 13:56:28.734076 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/11be63f5-e7f3-41b4-90b4-84890b429ced-catalog-content\") pod \"redhat-marketplace-qdfgc\" (UID: \"11be63f5-e7f3-41b4-90b4-84890b429ced\") " pod="openshift-marketplace/redhat-marketplace-qdfgc" Nov 28 13:56:28 crc kubenswrapper[4970]: I1128 13:56:28.734180 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/11be63f5-e7f3-41b4-90b4-84890b429ced-utilities\") pod \"redhat-marketplace-qdfgc\" (UID: \"11be63f5-e7f3-41b4-90b4-84890b429ced\") " pod="openshift-marketplace/redhat-marketplace-qdfgc" Nov 28 13:56:28 crc kubenswrapper[4970]: I1128 13:56:28.753462 4970 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-ff8hj\" (UniqueName: \"kubernetes.io/projected/11be63f5-e7f3-41b4-90b4-84890b429ced-kube-api-access-ff8hj\") pod \"redhat-marketplace-qdfgc\" (UID: \"11be63f5-e7f3-41b4-90b4-84890b429ced\") " pod="openshift-marketplace/redhat-marketplace-qdfgc" Nov 28 13:56:28 crc kubenswrapper[4970]: I1128 13:56:28.809499 4970 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-qdfgc" Nov 28 13:56:28 crc kubenswrapper[4970]: I1128 13:56:28.992938 4970 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-qdfgc"] Nov 28 13:56:29 crc kubenswrapper[4970]: I1128 13:56:29.367468 4970 generic.go:334] "Generic (PLEG): container finished" podID="11be63f5-e7f3-41b4-90b4-84890b429ced" containerID="3425294364aaa4c025247d235a8eac4d88346672f232bdc7b962edf0bf6da0b3" exitCode=0 Nov 28 13:56:29 crc kubenswrapper[4970]: I1128 13:56:29.367514 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qdfgc" event={"ID":"11be63f5-e7f3-41b4-90b4-84890b429ced","Type":"ContainerDied","Data":"3425294364aaa4c025247d235a8eac4d88346672f232bdc7b962edf0bf6da0b3"} Nov 28 13:56:29 crc kubenswrapper[4970]: I1128 13:56:29.367544 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qdfgc" event={"ID":"11be63f5-e7f3-41b4-90b4-84890b429ced","Type":"ContainerStarted","Data":"4fff6a77a6623b9ae34f6e16bb533e0d3225baaae430ba74c23fbf97ebb7b9c6"} Nov 28 13:56:31 crc kubenswrapper[4970]: I1128 13:56:31.381478 4970 generic.go:334] "Generic (PLEG): container finished" podID="11be63f5-e7f3-41b4-90b4-84890b429ced" containerID="0a229782f605c2736652725da96878bdd47ba82a49274f49ca938e4fc497ab6a" exitCode=0 Nov 28 13:56:31 crc kubenswrapper[4970]: I1128 13:56:31.401697 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qdfgc" event={"ID":"11be63f5-e7f3-41b4-90b4-84890b429ced","Type":"ContainerDied","Data":"0a229782f605c2736652725da96878bdd47ba82a49274f49ca938e4fc497ab6a"} Nov 28 13:56:32 crc kubenswrapper[4970]: I1128 13:56:32.391885 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qdfgc" event={"ID":"11be63f5-e7f3-41b4-90b4-84890b429ced","Type":"ContainerStarted","Data":"bbcb34d41d8b11ec55abfbae94dbf0ec0e6515b2f7025668803013029d45dd85"} Nov 28 13:56:38 crc kubenswrapper[4970]: I1128 13:56:38.809968 4970 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-qdfgc" Nov 28 13:56:38 crc kubenswrapper[4970]: I1128 13:56:38.810822 4970 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-qdfgc" Nov 28 13:56:38 crc kubenswrapper[4970]: I1128 13:56:38.853897 4970 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-qdfgc" Nov 28 13:56:38 crc kubenswrapper[4970]: I1128 13:56:38.877181 4970 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-qdfgc" podStartSLOduration=8.460111457 podStartE2EDuration="10.877157441s" podCreationTimestamp="2025-11-28 13:56:28 +0000 UTC" firstStartedPulling="2025-11-28 13:56:29.36923461 +0000 UTC m=+2200.222116430" lastFinishedPulling="2025-11-28 13:56:31.786280574 +0000 UTC m=+2202.639162414" observedRunningTime="2025-11-28 13:56:32.410830644 +0000 UTC 
m=+2203.263712444" watchObservedRunningTime="2025-11-28 13:56:38.877157441 +0000 UTC m=+2209.730039281" Nov 28 13:56:39 crc kubenswrapper[4970]: I1128 13:56:39.484591 4970 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-qdfgc" Nov 28 13:56:39 crc kubenswrapper[4970]: I1128 13:56:39.537296 4970 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-qdfgc"] Nov 28 13:56:41 crc kubenswrapper[4970]: I1128 13:56:41.450673 4970 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-qdfgc" podUID="11be63f5-e7f3-41b4-90b4-84890b429ced" containerName="registry-server" containerID="cri-o://bbcb34d41d8b11ec55abfbae94dbf0ec0e6515b2f7025668803013029d45dd85" gracePeriod=2 Nov 28 13:56:42 crc kubenswrapper[4970]: I1128 13:56:42.358237 4970 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-qdfgc" Nov 28 13:56:42 crc kubenswrapper[4970]: I1128 13:56:42.457180 4970 generic.go:334] "Generic (PLEG): container finished" podID="11be63f5-e7f3-41b4-90b4-84890b429ced" containerID="bbcb34d41d8b11ec55abfbae94dbf0ec0e6515b2f7025668803013029d45dd85" exitCode=0 Nov 28 13:56:42 crc kubenswrapper[4970]: I1128 13:56:42.457256 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qdfgc" event={"ID":"11be63f5-e7f3-41b4-90b4-84890b429ced","Type":"ContainerDied","Data":"bbcb34d41d8b11ec55abfbae94dbf0ec0e6515b2f7025668803013029d45dd85"} Nov 28 13:56:42 crc kubenswrapper[4970]: I1128 13:56:42.457303 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qdfgc" event={"ID":"11be63f5-e7f3-41b4-90b4-84890b429ced","Type":"ContainerDied","Data":"4fff6a77a6623b9ae34f6e16bb533e0d3225baaae430ba74c23fbf97ebb7b9c6"} Nov 28 13:56:42 crc kubenswrapper[4970]: I1128 13:56:42.457321 4970 scope.go:117] "RemoveContainer" containerID="bbcb34d41d8b11ec55abfbae94dbf0ec0e6515b2f7025668803013029d45dd85" Nov 28 13:56:42 crc kubenswrapper[4970]: I1128 13:56:42.457267 4970 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-qdfgc" Nov 28 13:56:42 crc kubenswrapper[4970]: I1128 13:56:42.473405 4970 scope.go:117] "RemoveContainer" containerID="0a229782f605c2736652725da96878bdd47ba82a49274f49ca938e4fc497ab6a" Nov 28 13:56:42 crc kubenswrapper[4970]: I1128 13:56:42.485295 4970 scope.go:117] "RemoveContainer" containerID="3425294364aaa4c025247d235a8eac4d88346672f232bdc7b962edf0bf6da0b3" Nov 28 13:56:42 crc kubenswrapper[4970]: I1128 13:56:42.519677 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ff8hj\" (UniqueName: \"kubernetes.io/projected/11be63f5-e7f3-41b4-90b4-84890b429ced-kube-api-access-ff8hj\") pod \"11be63f5-e7f3-41b4-90b4-84890b429ced\" (UID: \"11be63f5-e7f3-41b4-90b4-84890b429ced\") " Nov 28 13:56:42 crc kubenswrapper[4970]: I1128 13:56:42.519722 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/11be63f5-e7f3-41b4-90b4-84890b429ced-catalog-content\") pod \"11be63f5-e7f3-41b4-90b4-84890b429ced\" (UID: \"11be63f5-e7f3-41b4-90b4-84890b429ced\") " Nov 28 13:56:42 crc kubenswrapper[4970]: I1128 13:56:42.519749 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/11be63f5-e7f3-41b4-90b4-84890b429ced-utilities\") pod \"11be63f5-e7f3-41b4-90b4-84890b429ced\" (UID: \"11be63f5-e7f3-41b4-90b4-84890b429ced\") " Nov 28 13:56:42 crc kubenswrapper[4970]: I1128 13:56:42.520947 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/11be63f5-e7f3-41b4-90b4-84890b429ced-utilities" (OuterVolumeSpecName: "utilities") pod "11be63f5-e7f3-41b4-90b4-84890b429ced" (UID: "11be63f5-e7f3-41b4-90b4-84890b429ced"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 28 13:56:42 crc kubenswrapper[4970]: I1128 13:56:42.521056 4970 scope.go:117] "RemoveContainer" containerID="bbcb34d41d8b11ec55abfbae94dbf0ec0e6515b2f7025668803013029d45dd85" Nov 28 13:56:42 crc kubenswrapper[4970]: E1128 13:56:42.521540 4970 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bbcb34d41d8b11ec55abfbae94dbf0ec0e6515b2f7025668803013029d45dd85\": container with ID starting with bbcb34d41d8b11ec55abfbae94dbf0ec0e6515b2f7025668803013029d45dd85 not found: ID does not exist" containerID="bbcb34d41d8b11ec55abfbae94dbf0ec0e6515b2f7025668803013029d45dd85" Nov 28 13:56:42 crc kubenswrapper[4970]: I1128 13:56:42.521615 4970 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bbcb34d41d8b11ec55abfbae94dbf0ec0e6515b2f7025668803013029d45dd85"} err="failed to get container status \"bbcb34d41d8b11ec55abfbae94dbf0ec0e6515b2f7025668803013029d45dd85\": rpc error: code = NotFound desc = could not find container \"bbcb34d41d8b11ec55abfbae94dbf0ec0e6515b2f7025668803013029d45dd85\": container with ID starting with bbcb34d41d8b11ec55abfbae94dbf0ec0e6515b2f7025668803013029d45dd85 not found: ID does not exist" Nov 28 13:56:42 crc kubenswrapper[4970]: I1128 13:56:42.521653 4970 scope.go:117] "RemoveContainer" containerID="0a229782f605c2736652725da96878bdd47ba82a49274f49ca938e4fc497ab6a" Nov 28 13:56:42 crc kubenswrapper[4970]: E1128 13:56:42.522434 4970 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0a229782f605c2736652725da96878bdd47ba82a49274f49ca938e4fc497ab6a\": container with ID starting with 0a229782f605c2736652725da96878bdd47ba82a49274f49ca938e4fc497ab6a not found: ID does not exist" containerID="0a229782f605c2736652725da96878bdd47ba82a49274f49ca938e4fc497ab6a" Nov 28 13:56:42 crc kubenswrapper[4970]: I1128 13:56:42.522466 4970 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0a229782f605c2736652725da96878bdd47ba82a49274f49ca938e4fc497ab6a"} err="failed to get container status \"0a229782f605c2736652725da96878bdd47ba82a49274f49ca938e4fc497ab6a\": rpc error: code = NotFound desc = could not find container \"0a229782f605c2736652725da96878bdd47ba82a49274f49ca938e4fc497ab6a\": container with ID starting with 0a229782f605c2736652725da96878bdd47ba82a49274f49ca938e4fc497ab6a not found: ID does not exist" Nov 28 13:56:42 crc kubenswrapper[4970]: I1128 13:56:42.522485 4970 scope.go:117] "RemoveContainer" containerID="3425294364aaa4c025247d235a8eac4d88346672f232bdc7b962edf0bf6da0b3" Nov 28 13:56:42 crc kubenswrapper[4970]: E1128 13:56:42.522711 4970 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3425294364aaa4c025247d235a8eac4d88346672f232bdc7b962edf0bf6da0b3\": container with ID starting with 3425294364aaa4c025247d235a8eac4d88346672f232bdc7b962edf0bf6da0b3 not found: ID does not exist" containerID="3425294364aaa4c025247d235a8eac4d88346672f232bdc7b962edf0bf6da0b3" Nov 28 13:56:42 crc kubenswrapper[4970]: I1128 13:56:42.522738 4970 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3425294364aaa4c025247d235a8eac4d88346672f232bdc7b962edf0bf6da0b3"} err="failed to get container status \"3425294364aaa4c025247d235a8eac4d88346672f232bdc7b962edf0bf6da0b3\": rpc error: code = NotFound desc = could not 
find container \"3425294364aaa4c025247d235a8eac4d88346672f232bdc7b962edf0bf6da0b3\": container with ID starting with 3425294364aaa4c025247d235a8eac4d88346672f232bdc7b962edf0bf6da0b3 not found: ID does not exist" Nov 28 13:56:42 crc kubenswrapper[4970]: I1128 13:56:42.527448 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/11be63f5-e7f3-41b4-90b4-84890b429ced-kube-api-access-ff8hj" (OuterVolumeSpecName: "kube-api-access-ff8hj") pod "11be63f5-e7f3-41b4-90b4-84890b429ced" (UID: "11be63f5-e7f3-41b4-90b4-84890b429ced"). InnerVolumeSpecName "kube-api-access-ff8hj". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 28 13:56:42 crc kubenswrapper[4970]: I1128 13:56:42.537091 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/11be63f5-e7f3-41b4-90b4-84890b429ced-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "11be63f5-e7f3-41b4-90b4-84890b429ced" (UID: "11be63f5-e7f3-41b4-90b4-84890b429ced"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 28 13:56:42 crc kubenswrapper[4970]: I1128 13:56:42.621464 4970 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ff8hj\" (UniqueName: \"kubernetes.io/projected/11be63f5-e7f3-41b4-90b4-84890b429ced-kube-api-access-ff8hj\") on node \"crc\" DevicePath \"\"" Nov 28 13:56:42 crc kubenswrapper[4970]: I1128 13:56:42.621499 4970 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/11be63f5-e7f3-41b4-90b4-84890b429ced-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 28 13:56:42 crc kubenswrapper[4970]: I1128 13:56:42.621510 4970 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/11be63f5-e7f3-41b4-90b4-84890b429ced-utilities\") on node \"crc\" DevicePath \"\"" Nov 28 13:56:42 crc kubenswrapper[4970]: I1128 13:56:42.807831 4970 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-qdfgc"] Nov 28 13:56:42 crc kubenswrapper[4970]: I1128 13:56:42.816368 4970 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-qdfgc"] Nov 28 13:56:43 crc kubenswrapper[4970]: I1128 13:56:43.389530 4970 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="11be63f5-e7f3-41b4-90b4-84890b429ced" path="/var/lib/kubelet/pods/11be63f5-e7f3-41b4-90b4-84890b429ced/volumes" Nov 28 13:56:51 crc kubenswrapper[4970]: I1128 13:56:51.334346 4970 patch_prober.go:28] interesting pod/machine-config-daemon-tjrng container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 28 13:56:51 crc kubenswrapper[4970]: I1128 13:56:51.335030 4970 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-tjrng" podUID="70bedd43-c527-436e-b47b-0b9ec5b10601" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 28 13:56:51 crc kubenswrapper[4970]: I1128 13:56:51.335113 4970 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-tjrng" Nov 28 13:56:51 crc kubenswrapper[4970]: I1128 13:56:51.336205 4970 
kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"27500a4416cd52c2623e856038e9f8aaac3d83bbbbf963c829e97dc2c4d2fa55"} pod="openshift-machine-config-operator/machine-config-daemon-tjrng" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Nov 28 13:56:51 crc kubenswrapper[4970]: I1128 13:56:51.336357 4970 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-tjrng" podUID="70bedd43-c527-436e-b47b-0b9ec5b10601" containerName="machine-config-daemon" containerID="cri-o://27500a4416cd52c2623e856038e9f8aaac3d83bbbbf963c829e97dc2c4d2fa55" gracePeriod=600 Nov 28 13:56:51 crc kubenswrapper[4970]: E1128 13:56:51.460980 4970 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tjrng_openshift-machine-config-operator(70bedd43-c527-436e-b47b-0b9ec5b10601)\"" pod="openshift-machine-config-operator/machine-config-daemon-tjrng" podUID="70bedd43-c527-436e-b47b-0b9ec5b10601" Nov 28 13:56:51 crc kubenswrapper[4970]: I1128 13:56:51.520585 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-tjrng" event={"ID":"70bedd43-c527-436e-b47b-0b9ec5b10601","Type":"ContainerDied","Data":"27500a4416cd52c2623e856038e9f8aaac3d83bbbbf963c829e97dc2c4d2fa55"} Nov 28 13:56:51 crc kubenswrapper[4970]: I1128 13:56:51.520542 4970 generic.go:334] "Generic (PLEG): container finished" podID="70bedd43-c527-436e-b47b-0b9ec5b10601" containerID="27500a4416cd52c2623e856038e9f8aaac3d83bbbbf963c829e97dc2c4d2fa55" exitCode=0 Nov 28 13:56:51 crc kubenswrapper[4970]: I1128 13:56:51.520640 4970 scope.go:117] "RemoveContainer" containerID="989ffc3243eb185213e945a6f9a9c46c00572aa829c5947a5b48998743fc78c3" Nov 28 13:56:51 crc kubenswrapper[4970]: I1128 13:56:51.521434 4970 scope.go:117] "RemoveContainer" containerID="27500a4416cd52c2623e856038e9f8aaac3d83bbbbf963c829e97dc2c4d2fa55" Nov 28 13:56:51 crc kubenswrapper[4970]: E1128 13:56:51.521618 4970 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tjrng_openshift-machine-config-operator(70bedd43-c527-436e-b47b-0b9ec5b10601)\"" pod="openshift-machine-config-operator/machine-config-daemon-tjrng" podUID="70bedd43-c527-436e-b47b-0b9ec5b10601" Nov 28 13:57:04 crc kubenswrapper[4970]: I1128 13:57:04.380975 4970 scope.go:117] "RemoveContainer" containerID="27500a4416cd52c2623e856038e9f8aaac3d83bbbbf963c829e97dc2c4d2fa55" Nov 28 13:57:04 crc kubenswrapper[4970]: E1128 13:57:04.381833 4970 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tjrng_openshift-machine-config-operator(70bedd43-c527-436e-b47b-0b9ec5b10601)\"" pod="openshift-machine-config-operator/machine-config-daemon-tjrng" podUID="70bedd43-c527-436e-b47b-0b9ec5b10601" Nov 28 13:57:17 crc kubenswrapper[4970]: I1128 13:57:17.380532 4970 scope.go:117] "RemoveContainer" containerID="27500a4416cd52c2623e856038e9f8aaac3d83bbbbf963c829e97dc2c4d2fa55" Nov 28 
13:57:17 crc kubenswrapper[4970]: E1128 13:57:17.381197 4970 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tjrng_openshift-machine-config-operator(70bedd43-c527-436e-b47b-0b9ec5b10601)\"" pod="openshift-machine-config-operator/machine-config-daemon-tjrng" podUID="70bedd43-c527-436e-b47b-0b9ec5b10601" Nov 28 13:57:31 crc kubenswrapper[4970]: I1128 13:57:31.380998 4970 scope.go:117] "RemoveContainer" containerID="27500a4416cd52c2623e856038e9f8aaac3d83bbbbf963c829e97dc2c4d2fa55" Nov 28 13:57:31 crc kubenswrapper[4970]: E1128 13:57:31.381863 4970 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tjrng_openshift-machine-config-operator(70bedd43-c527-436e-b47b-0b9ec5b10601)\"" pod="openshift-machine-config-operator/machine-config-daemon-tjrng" podUID="70bedd43-c527-436e-b47b-0b9ec5b10601" Nov 28 13:57:45 crc kubenswrapper[4970]: I1128 13:57:45.381001 4970 scope.go:117] "RemoveContainer" containerID="27500a4416cd52c2623e856038e9f8aaac3d83bbbbf963c829e97dc2c4d2fa55" Nov 28 13:57:45 crc kubenswrapper[4970]: E1128 13:57:45.381690 4970 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tjrng_openshift-machine-config-operator(70bedd43-c527-436e-b47b-0b9ec5b10601)\"" pod="openshift-machine-config-operator/machine-config-daemon-tjrng" podUID="70bedd43-c527-436e-b47b-0b9ec5b10601" Nov 28 13:57:59 crc kubenswrapper[4970]: I1128 13:57:59.391506 4970 scope.go:117] "RemoveContainer" containerID="27500a4416cd52c2623e856038e9f8aaac3d83bbbbf963c829e97dc2c4d2fa55" Nov 28 13:57:59 crc kubenswrapper[4970]: E1128 13:57:59.392682 4970 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tjrng_openshift-machine-config-operator(70bedd43-c527-436e-b47b-0b9ec5b10601)\"" pod="openshift-machine-config-operator/machine-config-daemon-tjrng" podUID="70bedd43-c527-436e-b47b-0b9ec5b10601" Nov 28 13:58:09 crc kubenswrapper[4970]: I1128 13:58:09.451475 4970 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-j45r9/must-gather-t79mp"] Nov 28 13:58:09 crc kubenswrapper[4970]: E1128 13:58:09.452570 4970 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="11be63f5-e7f3-41b4-90b4-84890b429ced" containerName="extract-content" Nov 28 13:58:09 crc kubenswrapper[4970]: I1128 13:58:09.452591 4970 state_mem.go:107] "Deleted CPUSet assignment" podUID="11be63f5-e7f3-41b4-90b4-84890b429ced" containerName="extract-content" Nov 28 13:58:09 crc kubenswrapper[4970]: E1128 13:58:09.452610 4970 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="11be63f5-e7f3-41b4-90b4-84890b429ced" containerName="extract-utilities" Nov 28 13:58:09 crc kubenswrapper[4970]: I1128 13:58:09.452621 4970 state_mem.go:107] "Deleted CPUSet assignment" podUID="11be63f5-e7f3-41b4-90b4-84890b429ced" containerName="extract-utilities" Nov 28 13:58:09 crc kubenswrapper[4970]: E1128 13:58:09.452645 4970 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="11be63f5-e7f3-41b4-90b4-84890b429ced" containerName="registry-server" Nov 28 13:58:09 crc kubenswrapper[4970]: I1128 13:58:09.452658 4970 state_mem.go:107] "Deleted CPUSet assignment" podUID="11be63f5-e7f3-41b4-90b4-84890b429ced" containerName="registry-server" Nov 28 13:58:09 crc kubenswrapper[4970]: I1128 13:58:09.452850 4970 memory_manager.go:354] "RemoveStaleState removing state" podUID="11be63f5-e7f3-41b4-90b4-84890b429ced" containerName="registry-server" Nov 28 13:58:09 crc kubenswrapper[4970]: I1128 13:58:09.453816 4970 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-j45r9/must-gather-t79mp" Nov 28 13:58:09 crc kubenswrapper[4970]: I1128 13:58:09.455964 4970 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-j45r9"/"kube-root-ca.crt" Nov 28 13:58:09 crc kubenswrapper[4970]: I1128 13:58:09.456438 4970 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-j45r9"/"default-dockercfg-c8tz8" Nov 28 13:58:09 crc kubenswrapper[4970]: I1128 13:58:09.456597 4970 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-j45r9"/"openshift-service-ca.crt" Nov 28 13:58:09 crc kubenswrapper[4970]: I1128 13:58:09.462062 4970 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-j45r9/must-gather-t79mp"] Nov 28 13:58:09 crc kubenswrapper[4970]: I1128 13:58:09.499970 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/7a0a6fdd-7298-421f-b1c0-c59c88cc542d-must-gather-output\") pod \"must-gather-t79mp\" (UID: \"7a0a6fdd-7298-421f-b1c0-c59c88cc542d\") " pod="openshift-must-gather-j45r9/must-gather-t79mp" Nov 28 13:58:09 crc kubenswrapper[4970]: I1128 13:58:09.500026 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r8sdn\" (UniqueName: \"kubernetes.io/projected/7a0a6fdd-7298-421f-b1c0-c59c88cc542d-kube-api-access-r8sdn\") pod \"must-gather-t79mp\" (UID: \"7a0a6fdd-7298-421f-b1c0-c59c88cc542d\") " pod="openshift-must-gather-j45r9/must-gather-t79mp" Nov 28 13:58:09 crc kubenswrapper[4970]: I1128 13:58:09.601414 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/7a0a6fdd-7298-421f-b1c0-c59c88cc542d-must-gather-output\") pod \"must-gather-t79mp\" (UID: \"7a0a6fdd-7298-421f-b1c0-c59c88cc542d\") " pod="openshift-must-gather-j45r9/must-gather-t79mp" Nov 28 13:58:09 crc kubenswrapper[4970]: I1128 13:58:09.601472 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r8sdn\" (UniqueName: \"kubernetes.io/projected/7a0a6fdd-7298-421f-b1c0-c59c88cc542d-kube-api-access-r8sdn\") pod \"must-gather-t79mp\" (UID: \"7a0a6fdd-7298-421f-b1c0-c59c88cc542d\") " pod="openshift-must-gather-j45r9/must-gather-t79mp" Nov 28 13:58:09 crc kubenswrapper[4970]: I1128 13:58:09.601907 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/7a0a6fdd-7298-421f-b1c0-c59c88cc542d-must-gather-output\") pod \"must-gather-t79mp\" (UID: \"7a0a6fdd-7298-421f-b1c0-c59c88cc542d\") " pod="openshift-must-gather-j45r9/must-gather-t79mp" Nov 28 13:58:09 crc kubenswrapper[4970]: I1128 13:58:09.622860 4970 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-r8sdn\" (UniqueName: \"kubernetes.io/projected/7a0a6fdd-7298-421f-b1c0-c59c88cc542d-kube-api-access-r8sdn\") pod \"must-gather-t79mp\" (UID: \"7a0a6fdd-7298-421f-b1c0-c59c88cc542d\") " pod="openshift-must-gather-j45r9/must-gather-t79mp" Nov 28 13:58:09 crc kubenswrapper[4970]: I1128 13:58:09.774132 4970 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-j45r9/must-gather-t79mp" Nov 28 13:58:09 crc kubenswrapper[4970]: I1128 13:58:09.985841 4970 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-j45r9/must-gather-t79mp"] Nov 28 13:58:10 crc kubenswrapper[4970]: I1128 13:58:10.037236 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-j45r9/must-gather-t79mp" event={"ID":"7a0a6fdd-7298-421f-b1c0-c59c88cc542d","Type":"ContainerStarted","Data":"e476042bc77b299708d1a4925a0c29a623392c596a758003d0dc8b43b2803aef"} Nov 28 13:58:11 crc kubenswrapper[4970]: I1128 13:58:11.044680 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-j45r9/must-gather-t79mp" event={"ID":"7a0a6fdd-7298-421f-b1c0-c59c88cc542d","Type":"ContainerStarted","Data":"77a478e548f59efa16dbcc1aad2f286b84b02d8e9fd6f8fa119ced5a9ec78edf"} Nov 28 13:58:11 crc kubenswrapper[4970]: I1128 13:58:11.044981 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-j45r9/must-gather-t79mp" event={"ID":"7a0a6fdd-7298-421f-b1c0-c59c88cc542d","Type":"ContainerStarted","Data":"6c8bdbcf81af38755a73de73f826d792ba5e4a571fee9c6f18adc4a9e097885e"} Nov 28 13:58:11 crc kubenswrapper[4970]: I1128 13:58:11.061991 4970 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-j45r9/must-gather-t79mp" podStartSLOduration=2.061968267 podStartE2EDuration="2.061968267s" podCreationTimestamp="2025-11-28 13:58:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-28 13:58:11.060755413 +0000 UTC m=+2301.913637213" watchObservedRunningTime="2025-11-28 13:58:11.061968267 +0000 UTC m=+2301.914850067" Nov 28 13:58:11 crc kubenswrapper[4970]: I1128 13:58:11.381538 4970 scope.go:117] "RemoveContainer" containerID="27500a4416cd52c2623e856038e9f8aaac3d83bbbbf963c829e97dc2c4d2fa55" Nov 28 13:58:11 crc kubenswrapper[4970]: E1128 13:58:11.381947 4970 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tjrng_openshift-machine-config-operator(70bedd43-c527-436e-b47b-0b9ec5b10601)\"" pod="openshift-machine-config-operator/machine-config-daemon-tjrng" podUID="70bedd43-c527-436e-b47b-0b9ec5b10601" Nov 28 13:58:25 crc kubenswrapper[4970]: I1128 13:58:25.380794 4970 scope.go:117] "RemoveContainer" containerID="27500a4416cd52c2623e856038e9f8aaac3d83bbbbf963c829e97dc2c4d2fa55" Nov 28 13:58:25 crc kubenswrapper[4970]: E1128 13:58:25.381831 4970 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tjrng_openshift-machine-config-operator(70bedd43-c527-436e-b47b-0b9ec5b10601)\"" pod="openshift-machine-config-operator/machine-config-daemon-tjrng" podUID="70bedd43-c527-436e-b47b-0b9ec5b10601" Nov 28 
13:58:39 crc kubenswrapper[4970]: I1128 13:58:39.383332 4970 scope.go:117] "RemoveContainer" containerID="27500a4416cd52c2623e856038e9f8aaac3d83bbbbf963c829e97dc2c4d2fa55" Nov 28 13:58:39 crc kubenswrapper[4970]: E1128 13:58:39.384067 4970 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tjrng_openshift-machine-config-operator(70bedd43-c527-436e-b47b-0b9ec5b10601)\"" pod="openshift-machine-config-operator/machine-config-daemon-tjrng" podUID="70bedd43-c527-436e-b47b-0b9ec5b10601" Nov 28 13:58:50 crc kubenswrapper[4970]: I1128 13:58:50.380541 4970 scope.go:117] "RemoveContainer" containerID="27500a4416cd52c2623e856038e9f8aaac3d83bbbbf963c829e97dc2c4d2fa55" Nov 28 13:58:50 crc kubenswrapper[4970]: E1128 13:58:50.381280 4970 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tjrng_openshift-machine-config-operator(70bedd43-c527-436e-b47b-0b9ec5b10601)\"" pod="openshift-machine-config-operator/machine-config-daemon-tjrng" podUID="70bedd43-c527-436e-b47b-0b9ec5b10601" Nov 28 13:58:52 crc kubenswrapper[4970]: I1128 13:58:52.459262 4970 scope.go:117] "RemoveContainer" containerID="d6a44d57dac70abf7616d423ca23ce27df639c9eb6e1b84f50747db23b275a6d" Nov 28 13:58:53 crc kubenswrapper[4970]: I1128 13:58:53.214450 4970 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-hvcsr_b9e4bbc0-c71d-4cb0-82ab-a3c67a9a4894/control-plane-machine-set-operator/0.log" Nov 28 13:58:53 crc kubenswrapper[4970]: I1128 13:58:53.292349 4970 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-wms6k_5998853c-3fbb-403e-b222-5a5c939dbb58/kube-rbac-proxy/0.log" Nov 28 13:58:53 crc kubenswrapper[4970]: I1128 13:58:53.343387 4970 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-wms6k_5998853c-3fbb-403e-b222-5a5c939dbb58/machine-api-operator/0.log" Nov 28 13:59:04 crc kubenswrapper[4970]: I1128 13:59:04.381154 4970 scope.go:117] "RemoveContainer" containerID="27500a4416cd52c2623e856038e9f8aaac3d83bbbbf963c829e97dc2c4d2fa55" Nov 28 13:59:04 crc kubenswrapper[4970]: E1128 13:59:04.382065 4970 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tjrng_openshift-machine-config-operator(70bedd43-c527-436e-b47b-0b9ec5b10601)\"" pod="openshift-machine-config-operator/machine-config-daemon-tjrng" podUID="70bedd43-c527-436e-b47b-0b9ec5b10601" Nov 28 13:59:07 crc kubenswrapper[4970]: I1128 13:59:07.792583 4970 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-f8648f98b-7s5w4_e1eb09e8-cbb7-416b-9683-a42a8b611239/kube-rbac-proxy/0.log" Nov 28 13:59:07 crc kubenswrapper[4970]: I1128 13:59:07.816045 4970 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-f8648f98b-7s5w4_e1eb09e8-cbb7-416b-9683-a42a8b611239/controller/0.log" Nov 28 13:59:07 crc kubenswrapper[4970]: I1128 13:59:07.968623 4970 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-7qf2x_d8109ee2-cb6f-4706-a9d0-93fbec9b4234/cp-frr-files/0.log" Nov 28 13:59:08 crc kubenswrapper[4970]: I1128 13:59:08.140771 4970 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-7qf2x_d8109ee2-cb6f-4706-a9d0-93fbec9b4234/cp-frr-files/0.log" Nov 28 13:59:08 crc kubenswrapper[4970]: I1128 13:59:08.151141 4970 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-7qf2x_d8109ee2-cb6f-4706-a9d0-93fbec9b4234/cp-metrics/0.log" Nov 28 13:59:08 crc kubenswrapper[4970]: I1128 13:59:08.154076 4970 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-7qf2x_d8109ee2-cb6f-4706-a9d0-93fbec9b4234/cp-reloader/0.log" Nov 28 13:59:08 crc kubenswrapper[4970]: I1128 13:59:08.172625 4970 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-7qf2x_d8109ee2-cb6f-4706-a9d0-93fbec9b4234/cp-reloader/0.log" Nov 28 13:59:08 crc kubenswrapper[4970]: I1128 13:59:08.304397 4970 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-7qf2x_d8109ee2-cb6f-4706-a9d0-93fbec9b4234/cp-metrics/0.log" Nov 28 13:59:08 crc kubenswrapper[4970]: I1128 13:59:08.304780 4970 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-7qf2x_d8109ee2-cb6f-4706-a9d0-93fbec9b4234/cp-reloader/0.log" Nov 28 13:59:08 crc kubenswrapper[4970]: I1128 13:59:08.347057 4970 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-7qf2x_d8109ee2-cb6f-4706-a9d0-93fbec9b4234/cp-frr-files/0.log" Nov 28 13:59:08 crc kubenswrapper[4970]: I1128 13:59:08.348710 4970 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-7qf2x_d8109ee2-cb6f-4706-a9d0-93fbec9b4234/cp-metrics/0.log" Nov 28 13:59:08 crc kubenswrapper[4970]: I1128 13:59:08.550310 4970 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-7qf2x_d8109ee2-cb6f-4706-a9d0-93fbec9b4234/cp-frr-files/0.log" Nov 28 13:59:08 crc kubenswrapper[4970]: I1128 13:59:08.568492 4970 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-7qf2x_d8109ee2-cb6f-4706-a9d0-93fbec9b4234/cp-metrics/0.log" Nov 28 13:59:08 crc kubenswrapper[4970]: I1128 13:59:08.575661 4970 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-7qf2x_d8109ee2-cb6f-4706-a9d0-93fbec9b4234/cp-reloader/0.log" Nov 28 13:59:08 crc kubenswrapper[4970]: I1128 13:59:08.582482 4970 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-7qf2x_d8109ee2-cb6f-4706-a9d0-93fbec9b4234/controller/0.log" Nov 28 13:59:08 crc kubenswrapper[4970]: I1128 13:59:08.715897 4970 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-7qf2x_d8109ee2-cb6f-4706-a9d0-93fbec9b4234/kube-rbac-proxy/0.log" Nov 28 13:59:08 crc kubenswrapper[4970]: I1128 13:59:08.767812 4970 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-7qf2x_d8109ee2-cb6f-4706-a9d0-93fbec9b4234/frr-metrics/0.log" Nov 28 13:59:08 crc kubenswrapper[4970]: I1128 13:59:08.785769 4970 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-7qf2x_d8109ee2-cb6f-4706-a9d0-93fbec9b4234/kube-rbac-proxy-frr/0.log" Nov 28 13:59:08 crc kubenswrapper[4970]: I1128 13:59:08.949324 4970 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-7qf2x_d8109ee2-cb6f-4706-a9d0-93fbec9b4234/reloader/0.log" Nov 28 13:59:08 crc kubenswrapper[4970]: I1128 
13:59:08.996592 4970 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-7fcb986d4-tdbd5_539244ae-76b7-443b-9352-5d8d2f8da8e9/frr-k8s-webhook-server/0.log" Nov 28 13:59:09 crc kubenswrapper[4970]: I1128 13:59:09.165267 4970 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-7qf2x_d8109ee2-cb6f-4706-a9d0-93fbec9b4234/frr/0.log" Nov 28 13:59:09 crc kubenswrapper[4970]: I1128 13:59:09.169031 4970 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-7b8bd764cc-xwfzf_04e4d11f-bb00-41b3-9047-0669f0e051c2/manager/0.log" Nov 28 13:59:09 crc kubenswrapper[4970]: I1128 13:59:09.284238 4970 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-778f645448-g7nc5_3c970e39-c0b1-4690-8e0a-f925a49d72a9/webhook-server/0.log" Nov 28 13:59:09 crc kubenswrapper[4970]: I1128 13:59:09.325265 4970 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-vj6nx_ce7e9380-adac-4723-8ced-16693bce1923/kube-rbac-proxy/0.log" Nov 28 13:59:09 crc kubenswrapper[4970]: I1128 13:59:09.431136 4970 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-vj6nx_ce7e9380-adac-4723-8ced-16693bce1923/speaker/0.log" Nov 28 13:59:17 crc kubenswrapper[4970]: I1128 13:59:17.380838 4970 scope.go:117] "RemoveContainer" containerID="27500a4416cd52c2623e856038e9f8aaac3d83bbbbf963c829e97dc2c4d2fa55" Nov 28 13:59:17 crc kubenswrapper[4970]: E1128 13:59:17.381737 4970 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tjrng_openshift-machine-config-operator(70bedd43-c527-436e-b47b-0b9ec5b10601)\"" pod="openshift-machine-config-operator/machine-config-daemon-tjrng" podUID="70bedd43-c527-436e-b47b-0b9ec5b10601" Nov 28 13:59:31 crc kubenswrapper[4970]: I1128 13:59:31.381055 4970 scope.go:117] "RemoveContainer" containerID="27500a4416cd52c2623e856038e9f8aaac3d83bbbbf963c829e97dc2c4d2fa55" Nov 28 13:59:31 crc kubenswrapper[4970]: E1128 13:59:31.382077 4970 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tjrng_openshift-machine-config-operator(70bedd43-c527-436e-b47b-0b9ec5b10601)\"" pod="openshift-machine-config-operator/machine-config-daemon-tjrng" podUID="70bedd43-c527-436e-b47b-0b9ec5b10601" Nov 28 13:59:33 crc kubenswrapper[4970]: I1128 13:59:33.568064 4970 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83hlqts_03a02bc0-9c7f-440d-8cf3-0f21ddb0ff2c/util/0.log" Nov 28 13:59:33 crc kubenswrapper[4970]: I1128 13:59:33.714897 4970 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83hlqts_03a02bc0-9c7f-440d-8cf3-0f21ddb0ff2c/pull/0.log" Nov 28 13:59:33 crc kubenswrapper[4970]: I1128 13:59:33.747712 4970 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83hlqts_03a02bc0-9c7f-440d-8cf3-0f21ddb0ff2c/pull/0.log" Nov 28 13:59:33 crc kubenswrapper[4970]: I1128 13:59:33.752386 4970 log.go:25] "Finished parsing 
log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83hlqts_03a02bc0-9c7f-440d-8cf3-0f21ddb0ff2c/util/0.log" Nov 28 13:59:33 crc kubenswrapper[4970]: I1128 13:59:33.883698 4970 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83hlqts_03a02bc0-9c7f-440d-8cf3-0f21ddb0ff2c/pull/0.log" Nov 28 13:59:33 crc kubenswrapper[4970]: I1128 13:59:33.904660 4970 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83hlqts_03a02bc0-9c7f-440d-8cf3-0f21ddb0ff2c/extract/0.log" Nov 28 13:59:33 crc kubenswrapper[4970]: I1128 13:59:33.949296 4970 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83hlqts_03a02bc0-9c7f-440d-8cf3-0f21ddb0ff2c/util/0.log" Nov 28 13:59:34 crc kubenswrapper[4970]: I1128 13:59:34.033306 4970 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-wbnmk_fcde0e22-6f82-4495-932f-e5e57f31d4f7/extract-utilities/0.log" Nov 28 13:59:34 crc kubenswrapper[4970]: I1128 13:59:34.192695 4970 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-wbnmk_fcde0e22-6f82-4495-932f-e5e57f31d4f7/extract-content/0.log" Nov 28 13:59:34 crc kubenswrapper[4970]: I1128 13:59:34.205789 4970 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-wbnmk_fcde0e22-6f82-4495-932f-e5e57f31d4f7/extract-utilities/0.log" Nov 28 13:59:34 crc kubenswrapper[4970]: I1128 13:59:34.235283 4970 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-wbnmk_fcde0e22-6f82-4495-932f-e5e57f31d4f7/extract-content/0.log" Nov 28 13:59:34 crc kubenswrapper[4970]: I1128 13:59:34.424906 4970 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-wbnmk_fcde0e22-6f82-4495-932f-e5e57f31d4f7/extract-utilities/0.log" Nov 28 13:59:34 crc kubenswrapper[4970]: I1128 13:59:34.448334 4970 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-wbnmk_fcde0e22-6f82-4495-932f-e5e57f31d4f7/extract-content/0.log" Nov 28 13:59:34 crc kubenswrapper[4970]: I1128 13:59:34.656116 4970 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-dhrhm_bda49097-ef3b-4e2f-8f8c-cb54ea0818b7/extract-utilities/0.log" Nov 28 13:59:34 crc kubenswrapper[4970]: I1128 13:59:34.810715 4970 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-wbnmk_fcde0e22-6f82-4495-932f-e5e57f31d4f7/registry-server/0.log" Nov 28 13:59:34 crc kubenswrapper[4970]: I1128 13:59:34.832063 4970 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-dhrhm_bda49097-ef3b-4e2f-8f8c-cb54ea0818b7/extract-content/0.log" Nov 28 13:59:34 crc kubenswrapper[4970]: I1128 13:59:34.841976 4970 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-dhrhm_bda49097-ef3b-4e2f-8f8c-cb54ea0818b7/extract-utilities/0.log" Nov 28 13:59:34 crc kubenswrapper[4970]: I1128 13:59:34.854505 4970 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-dhrhm_bda49097-ef3b-4e2f-8f8c-cb54ea0818b7/extract-content/0.log" Nov 
28 13:59:35 crc kubenswrapper[4970]: I1128 13:59:35.011933 4970 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-dhrhm_bda49097-ef3b-4e2f-8f8c-cb54ea0818b7/extract-utilities/0.log" Nov 28 13:59:35 crc kubenswrapper[4970]: I1128 13:59:35.030233 4970 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-dhrhm_bda49097-ef3b-4e2f-8f8c-cb54ea0818b7/extract-content/0.log" Nov 28 13:59:35 crc kubenswrapper[4970]: I1128 13:59:35.221022 4970 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-9f9f2_933ca994-f31b-4c5a-b068-8942618eb443/extract-utilities/0.log" Nov 28 13:59:35 crc kubenswrapper[4970]: I1128 13:59:35.248995 4970 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-6lxld_cea9d8a5-d14f-4f9c-a800-815168dd799e/marketplace-operator/0.log" Nov 28 13:59:35 crc kubenswrapper[4970]: I1128 13:59:35.411036 4970 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-9f9f2_933ca994-f31b-4c5a-b068-8942618eb443/extract-content/0.log" Nov 28 13:59:35 crc kubenswrapper[4970]: I1128 13:59:35.437404 4970 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-9f9f2_933ca994-f31b-4c5a-b068-8942618eb443/extract-utilities/0.log" Nov 28 13:59:35 crc kubenswrapper[4970]: I1128 13:59:35.462231 4970 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-dhrhm_bda49097-ef3b-4e2f-8f8c-cb54ea0818b7/registry-server/0.log" Nov 28 13:59:35 crc kubenswrapper[4970]: I1128 13:59:35.481865 4970 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-9f9f2_933ca994-f31b-4c5a-b068-8942618eb443/extract-content/0.log" Nov 28 13:59:35 crc kubenswrapper[4970]: I1128 13:59:35.684851 4970 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-9f9f2_933ca994-f31b-4c5a-b068-8942618eb443/extract-utilities/0.log" Nov 28 13:59:35 crc kubenswrapper[4970]: I1128 13:59:35.686472 4970 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-9f9f2_933ca994-f31b-4c5a-b068-8942618eb443/extract-content/0.log" Nov 28 13:59:35 crc kubenswrapper[4970]: I1128 13:59:35.818269 4970 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-9f9f2_933ca994-f31b-4c5a-b068-8942618eb443/registry-server/0.log" Nov 28 13:59:35 crc kubenswrapper[4970]: I1128 13:59:35.882296 4970 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-z7zkq_dd8f781f-7121-4875-adec-2318c2ecd8e2/extract-utilities/0.log" Nov 28 13:59:36 crc kubenswrapper[4970]: I1128 13:59:36.072465 4970 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-z7zkq_dd8f781f-7121-4875-adec-2318c2ecd8e2/extract-content/0.log" Nov 28 13:59:36 crc kubenswrapper[4970]: I1128 13:59:36.073136 4970 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-z7zkq_dd8f781f-7121-4875-adec-2318c2ecd8e2/extract-utilities/0.log" Nov 28 13:59:36 crc kubenswrapper[4970]: I1128 13:59:36.126445 4970 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-z7zkq_dd8f781f-7121-4875-adec-2318c2ecd8e2/extract-content/0.log" Nov 28 13:59:36 crc 
kubenswrapper[4970]: I1128 13:59:36.339474 4970 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-z7zkq_dd8f781f-7121-4875-adec-2318c2ecd8e2/extract-content/0.log" Nov 28 13:59:36 crc kubenswrapper[4970]: I1128 13:59:36.349593 4970 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-z7zkq_dd8f781f-7121-4875-adec-2318c2ecd8e2/extract-utilities/0.log" Nov 28 13:59:36 crc kubenswrapper[4970]: I1128 13:59:36.611313 4970 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-z7zkq_dd8f781f-7121-4875-adec-2318c2ecd8e2/registry-server/0.log" Nov 28 13:59:44 crc kubenswrapper[4970]: I1128 13:59:44.381337 4970 scope.go:117] "RemoveContainer" containerID="27500a4416cd52c2623e856038e9f8aaac3d83bbbbf963c829e97dc2c4d2fa55" Nov 28 13:59:44 crc kubenswrapper[4970]: E1128 13:59:44.382072 4970 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tjrng_openshift-machine-config-operator(70bedd43-c527-436e-b47b-0b9ec5b10601)\"" pod="openshift-machine-config-operator/machine-config-daemon-tjrng" podUID="70bedd43-c527-436e-b47b-0b9ec5b10601" Nov 28 13:59:55 crc kubenswrapper[4970]: I1128 13:59:55.380806 4970 scope.go:117] "RemoveContainer" containerID="27500a4416cd52c2623e856038e9f8aaac3d83bbbbf963c829e97dc2c4d2fa55" Nov 28 13:59:55 crc kubenswrapper[4970]: E1128 13:59:55.381526 4970 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tjrng_openshift-machine-config-operator(70bedd43-c527-436e-b47b-0b9ec5b10601)\"" pod="openshift-machine-config-operator/machine-config-daemon-tjrng" podUID="70bedd43-c527-436e-b47b-0b9ec5b10601" Nov 28 14:00:00 crc kubenswrapper[4970]: I1128 14:00:00.152133 4970 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29405640-5tfcf"] Nov 28 14:00:00 crc kubenswrapper[4970]: I1128 14:00:00.153205 4970 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29405640-5tfcf" Nov 28 14:00:00 crc kubenswrapper[4970]: I1128 14:00:00.156477 4970 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Nov 28 14:00:00 crc kubenswrapper[4970]: I1128 14:00:00.156999 4970 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Nov 28 14:00:00 crc kubenswrapper[4970]: I1128 14:00:00.160056 4970 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29405640-5tfcf"] Nov 28 14:00:00 crc kubenswrapper[4970]: I1128 14:00:00.305991 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/46768012-175b-4fa6-9272-397577409a24-secret-volume\") pod \"collect-profiles-29405640-5tfcf\" (UID: \"46768012-175b-4fa6-9272-397577409a24\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29405640-5tfcf" Nov 28 14:00:00 crc kubenswrapper[4970]: I1128 14:00:00.306139 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wsn7w\" (UniqueName: \"kubernetes.io/projected/46768012-175b-4fa6-9272-397577409a24-kube-api-access-wsn7w\") pod \"collect-profiles-29405640-5tfcf\" (UID: \"46768012-175b-4fa6-9272-397577409a24\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29405640-5tfcf" Nov 28 14:00:00 crc kubenswrapper[4970]: I1128 14:00:00.306180 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/46768012-175b-4fa6-9272-397577409a24-config-volume\") pod \"collect-profiles-29405640-5tfcf\" (UID: \"46768012-175b-4fa6-9272-397577409a24\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29405640-5tfcf" Nov 28 14:00:00 crc kubenswrapper[4970]: I1128 14:00:00.407692 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/46768012-175b-4fa6-9272-397577409a24-secret-volume\") pod \"collect-profiles-29405640-5tfcf\" (UID: \"46768012-175b-4fa6-9272-397577409a24\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29405640-5tfcf" Nov 28 14:00:00 crc kubenswrapper[4970]: I1128 14:00:00.407798 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wsn7w\" (UniqueName: \"kubernetes.io/projected/46768012-175b-4fa6-9272-397577409a24-kube-api-access-wsn7w\") pod \"collect-profiles-29405640-5tfcf\" (UID: \"46768012-175b-4fa6-9272-397577409a24\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29405640-5tfcf" Nov 28 14:00:00 crc kubenswrapper[4970]: I1128 14:00:00.407823 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/46768012-175b-4fa6-9272-397577409a24-config-volume\") pod \"collect-profiles-29405640-5tfcf\" (UID: \"46768012-175b-4fa6-9272-397577409a24\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29405640-5tfcf" Nov 28 14:00:00 crc kubenswrapper[4970]: I1128 14:00:00.409359 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/46768012-175b-4fa6-9272-397577409a24-config-volume\") pod 
\"collect-profiles-29405640-5tfcf\" (UID: \"46768012-175b-4fa6-9272-397577409a24\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29405640-5tfcf" Nov 28 14:00:00 crc kubenswrapper[4970]: I1128 14:00:00.413485 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/46768012-175b-4fa6-9272-397577409a24-secret-volume\") pod \"collect-profiles-29405640-5tfcf\" (UID: \"46768012-175b-4fa6-9272-397577409a24\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29405640-5tfcf" Nov 28 14:00:00 crc kubenswrapper[4970]: I1128 14:00:00.423942 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wsn7w\" (UniqueName: \"kubernetes.io/projected/46768012-175b-4fa6-9272-397577409a24-kube-api-access-wsn7w\") pod \"collect-profiles-29405640-5tfcf\" (UID: \"46768012-175b-4fa6-9272-397577409a24\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29405640-5tfcf" Nov 28 14:00:00 crc kubenswrapper[4970]: I1128 14:00:00.531337 4970 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29405640-5tfcf" Nov 28 14:00:00 crc kubenswrapper[4970]: I1128 14:00:00.747460 4970 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29405640-5tfcf"] Nov 28 14:00:01 crc kubenswrapper[4970]: I1128 14:00:01.692718 4970 generic.go:334] "Generic (PLEG): container finished" podID="46768012-175b-4fa6-9272-397577409a24" containerID="839fa30934c23a55cf8249aa90befa79ea83cae125b873c825d4a46d89102a2d" exitCode=0 Nov 28 14:00:01 crc kubenswrapper[4970]: I1128 14:00:01.694086 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29405640-5tfcf" event={"ID":"46768012-175b-4fa6-9272-397577409a24","Type":"ContainerDied","Data":"839fa30934c23a55cf8249aa90befa79ea83cae125b873c825d4a46d89102a2d"} Nov 28 14:00:01 crc kubenswrapper[4970]: I1128 14:00:01.694742 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29405640-5tfcf" event={"ID":"46768012-175b-4fa6-9272-397577409a24","Type":"ContainerStarted","Data":"cda412e97316b317bb91f5b3c21c93903532d3f808a169215fff93dd141bf7f1"} Nov 28 14:00:02 crc kubenswrapper[4970]: I1128 14:00:02.922331 4970 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29405640-5tfcf" Nov 28 14:00:03 crc kubenswrapper[4970]: I1128 14:00:03.045834 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/46768012-175b-4fa6-9272-397577409a24-secret-volume\") pod \"46768012-175b-4fa6-9272-397577409a24\" (UID: \"46768012-175b-4fa6-9272-397577409a24\") " Nov 28 14:00:03 crc kubenswrapper[4970]: I1128 14:00:03.045932 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wsn7w\" (UniqueName: \"kubernetes.io/projected/46768012-175b-4fa6-9272-397577409a24-kube-api-access-wsn7w\") pod \"46768012-175b-4fa6-9272-397577409a24\" (UID: \"46768012-175b-4fa6-9272-397577409a24\") " Nov 28 14:00:03 crc kubenswrapper[4970]: I1128 14:00:03.045977 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/46768012-175b-4fa6-9272-397577409a24-config-volume\") pod \"46768012-175b-4fa6-9272-397577409a24\" (UID: \"46768012-175b-4fa6-9272-397577409a24\") " Nov 28 14:00:03 crc kubenswrapper[4970]: I1128 14:00:03.046779 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/46768012-175b-4fa6-9272-397577409a24-config-volume" (OuterVolumeSpecName: "config-volume") pod "46768012-175b-4fa6-9272-397577409a24" (UID: "46768012-175b-4fa6-9272-397577409a24"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 28 14:00:03 crc kubenswrapper[4970]: I1128 14:00:03.051746 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/46768012-175b-4fa6-9272-397577409a24-kube-api-access-wsn7w" (OuterVolumeSpecName: "kube-api-access-wsn7w") pod "46768012-175b-4fa6-9272-397577409a24" (UID: "46768012-175b-4fa6-9272-397577409a24"). InnerVolumeSpecName "kube-api-access-wsn7w". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 28 14:00:03 crc kubenswrapper[4970]: I1128 14:00:03.052233 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/46768012-175b-4fa6-9272-397577409a24-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "46768012-175b-4fa6-9272-397577409a24" (UID: "46768012-175b-4fa6-9272-397577409a24"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 28 14:00:03 crc kubenswrapper[4970]: I1128 14:00:03.147343 4970 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wsn7w\" (UniqueName: \"kubernetes.io/projected/46768012-175b-4fa6-9272-397577409a24-kube-api-access-wsn7w\") on node \"crc\" DevicePath \"\"" Nov 28 14:00:03 crc kubenswrapper[4970]: I1128 14:00:03.147381 4970 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/46768012-175b-4fa6-9272-397577409a24-config-volume\") on node \"crc\" DevicePath \"\"" Nov 28 14:00:03 crc kubenswrapper[4970]: I1128 14:00:03.147390 4970 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/46768012-175b-4fa6-9272-397577409a24-secret-volume\") on node \"crc\" DevicePath \"\"" Nov 28 14:00:03 crc kubenswrapper[4970]: I1128 14:00:03.708049 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29405640-5tfcf" event={"ID":"46768012-175b-4fa6-9272-397577409a24","Type":"ContainerDied","Data":"cda412e97316b317bb91f5b3c21c93903532d3f808a169215fff93dd141bf7f1"} Nov 28 14:00:03 crc kubenswrapper[4970]: I1128 14:00:03.708091 4970 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cda412e97316b317bb91f5b3c21c93903532d3f808a169215fff93dd141bf7f1" Nov 28 14:00:03 crc kubenswrapper[4970]: I1128 14:00:03.708148 4970 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29405640-5tfcf" Nov 28 14:00:04 crc kubenswrapper[4970]: I1128 14:00:04.021720 4970 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29405595-bgxjt"] Nov 28 14:00:04 crc kubenswrapper[4970]: I1128 14:00:04.030889 4970 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29405595-bgxjt"] Nov 28 14:00:05 crc kubenswrapper[4970]: I1128 14:00:05.399470 4970 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="78210a9d-d2ee-4d21-a0e5-956cb8fd85d2" path="/var/lib/kubelet/pods/78210a9d-d2ee-4d21-a0e5-956cb8fd85d2/volumes" Nov 28 14:00:09 crc kubenswrapper[4970]: I1128 14:00:09.389266 4970 scope.go:117] "RemoveContainer" containerID="27500a4416cd52c2623e856038e9f8aaac3d83bbbbf963c829e97dc2c4d2fa55" Nov 28 14:00:09 crc kubenswrapper[4970]: E1128 14:00:09.390508 4970 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tjrng_openshift-machine-config-operator(70bedd43-c527-436e-b47b-0b9ec5b10601)\"" pod="openshift-machine-config-operator/machine-config-daemon-tjrng" podUID="70bedd43-c527-436e-b47b-0b9ec5b10601" Nov 28 14:00:24 crc kubenswrapper[4970]: I1128 14:00:24.380270 4970 scope.go:117] "RemoveContainer" containerID="27500a4416cd52c2623e856038e9f8aaac3d83bbbbf963c829e97dc2c4d2fa55" Nov 28 14:00:24 crc kubenswrapper[4970]: E1128 14:00:24.380932 4970 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tjrng_openshift-machine-config-operator(70bedd43-c527-436e-b47b-0b9ec5b10601)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-tjrng" podUID="70bedd43-c527-436e-b47b-0b9ec5b10601" Nov 28 14:00:35 crc kubenswrapper[4970]: I1128 14:00:35.381134 4970 scope.go:117] "RemoveContainer" containerID="27500a4416cd52c2623e856038e9f8aaac3d83bbbbf963c829e97dc2c4d2fa55" Nov 28 14:00:35 crc kubenswrapper[4970]: E1128 14:00:35.382928 4970 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tjrng_openshift-machine-config-operator(70bedd43-c527-436e-b47b-0b9ec5b10601)\"" pod="openshift-machine-config-operator/machine-config-daemon-tjrng" podUID="70bedd43-c527-436e-b47b-0b9ec5b10601" Nov 28 14:00:40 crc kubenswrapper[4970]: I1128 14:00:40.984874 4970 generic.go:334] "Generic (PLEG): container finished" podID="7a0a6fdd-7298-421f-b1c0-c59c88cc542d" containerID="6c8bdbcf81af38755a73de73f826d792ba5e4a571fee9c6f18adc4a9e097885e" exitCode=0 Nov 28 14:00:40 crc kubenswrapper[4970]: I1128 14:00:40.985002 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-j45r9/must-gather-t79mp" event={"ID":"7a0a6fdd-7298-421f-b1c0-c59c88cc542d","Type":"ContainerDied","Data":"6c8bdbcf81af38755a73de73f826d792ba5e4a571fee9c6f18adc4a9e097885e"} Nov 28 14:00:40 crc kubenswrapper[4970]: I1128 14:00:40.986393 4970 scope.go:117] "RemoveContainer" containerID="6c8bdbcf81af38755a73de73f826d792ba5e4a571fee9c6f18adc4a9e097885e" Nov 28 14:00:41 crc kubenswrapper[4970]: I1128 14:00:41.983540 4970 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-j45r9_must-gather-t79mp_7a0a6fdd-7298-421f-b1c0-c59c88cc542d/gather/0.log" Nov 28 14:00:50 crc kubenswrapper[4970]: I1128 14:00:50.380784 4970 scope.go:117] "RemoveContainer" containerID="27500a4416cd52c2623e856038e9f8aaac3d83bbbbf963c829e97dc2c4d2fa55" Nov 28 14:00:50 crc kubenswrapper[4970]: E1128 14:00:50.381607 4970 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tjrng_openshift-machine-config-operator(70bedd43-c527-436e-b47b-0b9ec5b10601)\"" pod="openshift-machine-config-operator/machine-config-daemon-tjrng" podUID="70bedd43-c527-436e-b47b-0b9ec5b10601" Nov 28 14:00:50 crc kubenswrapper[4970]: I1128 14:00:50.410652 4970 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-j45r9/must-gather-t79mp"] Nov 28 14:00:50 crc kubenswrapper[4970]: I1128 14:00:50.410929 4970 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-j45r9/must-gather-t79mp" podUID="7a0a6fdd-7298-421f-b1c0-c59c88cc542d" containerName="copy" containerID="cri-o://77a478e548f59efa16dbcc1aad2f286b84b02d8e9fd6f8fa119ced5a9ec78edf" gracePeriod=2 Nov 28 14:00:50 crc kubenswrapper[4970]: I1128 14:00:50.416557 4970 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-j45r9/must-gather-t79mp"] Nov 28 14:00:50 crc kubenswrapper[4970]: I1128 14:00:50.745300 4970 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-j45r9_must-gather-t79mp_7a0a6fdd-7298-421f-b1c0-c59c88cc542d/copy/0.log" Nov 28 14:00:50 crc kubenswrapper[4970]: I1128 14:00:50.746103 4970 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-j45r9/must-gather-t79mp" Nov 28 14:00:50 crc kubenswrapper[4970]: I1128 14:00:50.852748 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/7a0a6fdd-7298-421f-b1c0-c59c88cc542d-must-gather-output\") pod \"7a0a6fdd-7298-421f-b1c0-c59c88cc542d\" (UID: \"7a0a6fdd-7298-421f-b1c0-c59c88cc542d\") " Nov 28 14:00:50 crc kubenswrapper[4970]: I1128 14:00:50.852882 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r8sdn\" (UniqueName: \"kubernetes.io/projected/7a0a6fdd-7298-421f-b1c0-c59c88cc542d-kube-api-access-r8sdn\") pod \"7a0a6fdd-7298-421f-b1c0-c59c88cc542d\" (UID: \"7a0a6fdd-7298-421f-b1c0-c59c88cc542d\") " Nov 28 14:00:50 crc kubenswrapper[4970]: I1128 14:00:50.858455 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7a0a6fdd-7298-421f-b1c0-c59c88cc542d-kube-api-access-r8sdn" (OuterVolumeSpecName: "kube-api-access-r8sdn") pod "7a0a6fdd-7298-421f-b1c0-c59c88cc542d" (UID: "7a0a6fdd-7298-421f-b1c0-c59c88cc542d"). InnerVolumeSpecName "kube-api-access-r8sdn". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 28 14:00:50 crc kubenswrapper[4970]: I1128 14:00:50.919189 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7a0a6fdd-7298-421f-b1c0-c59c88cc542d-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "7a0a6fdd-7298-421f-b1c0-c59c88cc542d" (UID: "7a0a6fdd-7298-421f-b1c0-c59c88cc542d"). InnerVolumeSpecName "must-gather-output". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 28 14:00:50 crc kubenswrapper[4970]: I1128 14:00:50.954645 4970 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/7a0a6fdd-7298-421f-b1c0-c59c88cc542d-must-gather-output\") on node \"crc\" DevicePath \"\"" Nov 28 14:00:50 crc kubenswrapper[4970]: I1128 14:00:50.954681 4970 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r8sdn\" (UniqueName: \"kubernetes.io/projected/7a0a6fdd-7298-421f-b1c0-c59c88cc542d-kube-api-access-r8sdn\") on node \"crc\" DevicePath \"\"" Nov 28 14:00:51 crc kubenswrapper[4970]: I1128 14:00:51.050632 4970 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-j45r9_must-gather-t79mp_7a0a6fdd-7298-421f-b1c0-c59c88cc542d/copy/0.log" Nov 28 14:00:51 crc kubenswrapper[4970]: I1128 14:00:51.051010 4970 generic.go:334] "Generic (PLEG): container finished" podID="7a0a6fdd-7298-421f-b1c0-c59c88cc542d" containerID="77a478e548f59efa16dbcc1aad2f286b84b02d8e9fd6f8fa119ced5a9ec78edf" exitCode=143 Nov 28 14:00:51 crc kubenswrapper[4970]: I1128 14:00:51.051077 4970 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-j45r9/must-gather-t79mp" Nov 28 14:00:51 crc kubenswrapper[4970]: I1128 14:00:51.051093 4970 scope.go:117] "RemoveContainer" containerID="77a478e548f59efa16dbcc1aad2f286b84b02d8e9fd6f8fa119ced5a9ec78edf" Nov 28 14:00:51 crc kubenswrapper[4970]: I1128 14:00:51.073464 4970 scope.go:117] "RemoveContainer" containerID="6c8bdbcf81af38755a73de73f826d792ba5e4a571fee9c6f18adc4a9e097885e" Nov 28 14:00:51 crc kubenswrapper[4970]: I1128 14:00:51.118727 4970 scope.go:117] "RemoveContainer" containerID="77a478e548f59efa16dbcc1aad2f286b84b02d8e9fd6f8fa119ced5a9ec78edf" Nov 28 14:00:51 crc kubenswrapper[4970]: E1128 14:00:51.119185 4970 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"77a478e548f59efa16dbcc1aad2f286b84b02d8e9fd6f8fa119ced5a9ec78edf\": container with ID starting with 77a478e548f59efa16dbcc1aad2f286b84b02d8e9fd6f8fa119ced5a9ec78edf not found: ID does not exist" containerID="77a478e548f59efa16dbcc1aad2f286b84b02d8e9fd6f8fa119ced5a9ec78edf" Nov 28 14:00:51 crc kubenswrapper[4970]: I1128 14:00:51.119240 4970 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"77a478e548f59efa16dbcc1aad2f286b84b02d8e9fd6f8fa119ced5a9ec78edf"} err="failed to get container status \"77a478e548f59efa16dbcc1aad2f286b84b02d8e9fd6f8fa119ced5a9ec78edf\": rpc error: code = NotFound desc = could not find container \"77a478e548f59efa16dbcc1aad2f286b84b02d8e9fd6f8fa119ced5a9ec78edf\": container with ID starting with 77a478e548f59efa16dbcc1aad2f286b84b02d8e9fd6f8fa119ced5a9ec78edf not found: ID does not exist" Nov 28 14:00:51 crc kubenswrapper[4970]: I1128 14:00:51.119267 4970 scope.go:117] "RemoveContainer" containerID="6c8bdbcf81af38755a73de73f826d792ba5e4a571fee9c6f18adc4a9e097885e" Nov 28 14:00:51 crc kubenswrapper[4970]: E1128 14:00:51.122521 4970 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6c8bdbcf81af38755a73de73f826d792ba5e4a571fee9c6f18adc4a9e097885e\": container with ID starting with 6c8bdbcf81af38755a73de73f826d792ba5e4a571fee9c6f18adc4a9e097885e not found: ID does not exist" containerID="6c8bdbcf81af38755a73de73f826d792ba5e4a571fee9c6f18adc4a9e097885e" Nov 28 14:00:51 crc kubenswrapper[4970]: I1128 14:00:51.122562 4970 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6c8bdbcf81af38755a73de73f826d792ba5e4a571fee9c6f18adc4a9e097885e"} err="failed to get container status \"6c8bdbcf81af38755a73de73f826d792ba5e4a571fee9c6f18adc4a9e097885e\": rpc error: code = NotFound desc = could not find container \"6c8bdbcf81af38755a73de73f826d792ba5e4a571fee9c6f18adc4a9e097885e\": container with ID starting with 6c8bdbcf81af38755a73de73f826d792ba5e4a571fee9c6f18adc4a9e097885e not found: ID does not exist" Nov 28 14:00:51 crc kubenswrapper[4970]: I1128 14:00:51.388001 4970 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7a0a6fdd-7298-421f-b1c0-c59c88cc542d" path="/var/lib/kubelet/pods/7a0a6fdd-7298-421f-b1c0-c59c88cc542d/volumes" Nov 28 14:00:52 crc kubenswrapper[4970]: I1128 14:00:52.510560 4970 scope.go:117] "RemoveContainer" containerID="671fa61148bf67534230c0ca5cca4e9f9e870c2997e7c935de145e1b90e3ed1b" Nov 28 14:01:04 crc kubenswrapper[4970]: I1128 14:01:04.380475 4970 scope.go:117] "RemoveContainer" containerID="27500a4416cd52c2623e856038e9f8aaac3d83bbbbf963c829e97dc2c4d2fa55" Nov 28 14:01:04 crc 
kubenswrapper[4970]: E1128 14:01:04.381334 4970 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tjrng_openshift-machine-config-operator(70bedd43-c527-436e-b47b-0b9ec5b10601)\"" pod="openshift-machine-config-operator/machine-config-daemon-tjrng" podUID="70bedd43-c527-436e-b47b-0b9ec5b10601" Nov 28 14:01:15 crc kubenswrapper[4970]: I1128 14:01:15.381479 4970 scope.go:117] "RemoveContainer" containerID="27500a4416cd52c2623e856038e9f8aaac3d83bbbbf963c829e97dc2c4d2fa55" Nov 28 14:01:15 crc kubenswrapper[4970]: E1128 14:01:15.382237 4970 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tjrng_openshift-machine-config-operator(70bedd43-c527-436e-b47b-0b9ec5b10601)\"" pod="openshift-machine-config-operator/machine-config-daemon-tjrng" podUID="70bedd43-c527-436e-b47b-0b9ec5b10601" Nov 28 14:01:29 crc kubenswrapper[4970]: I1128 14:01:29.385288 4970 scope.go:117] "RemoveContainer" containerID="27500a4416cd52c2623e856038e9f8aaac3d83bbbbf963c829e97dc2c4d2fa55" Nov 28 14:01:29 crc kubenswrapper[4970]: E1128 14:01:29.386572 4970 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tjrng_openshift-machine-config-operator(70bedd43-c527-436e-b47b-0b9ec5b10601)\"" pod="openshift-machine-config-operator/machine-config-daemon-tjrng" podUID="70bedd43-c527-436e-b47b-0b9ec5b10601" Nov 28 14:01:41 crc kubenswrapper[4970]: I1128 14:01:41.380917 4970 scope.go:117] "RemoveContainer" containerID="27500a4416cd52c2623e856038e9f8aaac3d83bbbbf963c829e97dc2c4d2fa55" Nov 28 14:01:41 crc kubenswrapper[4970]: E1128 14:01:41.382734 4970 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tjrng_openshift-machine-config-operator(70bedd43-c527-436e-b47b-0b9ec5b10601)\"" pod="openshift-machine-config-operator/machine-config-daemon-tjrng" podUID="70bedd43-c527-436e-b47b-0b9ec5b10601" Nov 28 14:01:54 crc kubenswrapper[4970]: I1128 14:01:54.381584 4970 scope.go:117] "RemoveContainer" containerID="27500a4416cd52c2623e856038e9f8aaac3d83bbbbf963c829e97dc2c4d2fa55" Nov 28 14:01:55 crc kubenswrapper[4970]: I1128 14:01:55.458258 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-tjrng" event={"ID":"70bedd43-c527-436e-b47b-0b9ec5b10601","Type":"ContainerStarted","Data":"1c50dcc6fba9498bd20ea8c29ba97015094585ab423b0afb09736c12a2394670"} Nov 28 14:03:42 crc kubenswrapper[4970]: I1128 14:03:42.386001 4970 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-s7ppz"] Nov 28 14:03:42 crc kubenswrapper[4970]: E1128 14:03:42.387326 4970 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7a0a6fdd-7298-421f-b1c0-c59c88cc542d" containerName="copy" Nov 28 14:03:42 crc kubenswrapper[4970]: I1128 14:03:42.387348 4970 state_mem.go:107] "Deleted CPUSet assignment" podUID="7a0a6fdd-7298-421f-b1c0-c59c88cc542d" containerName="copy" Nov 
28 14:03:42 crc kubenswrapper[4970]: E1128 14:03:42.387367 4970 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="46768012-175b-4fa6-9272-397577409a24" containerName="collect-profiles" Nov 28 14:03:42 crc kubenswrapper[4970]: I1128 14:03:42.387375 4970 state_mem.go:107] "Deleted CPUSet assignment" podUID="46768012-175b-4fa6-9272-397577409a24" containerName="collect-profiles" Nov 28 14:03:42 crc kubenswrapper[4970]: E1128 14:03:42.387391 4970 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7a0a6fdd-7298-421f-b1c0-c59c88cc542d" containerName="gather" Nov 28 14:03:42 crc kubenswrapper[4970]: I1128 14:03:42.387399 4970 state_mem.go:107] "Deleted CPUSet assignment" podUID="7a0a6fdd-7298-421f-b1c0-c59c88cc542d" containerName="gather" Nov 28 14:03:42 crc kubenswrapper[4970]: I1128 14:03:42.387533 4970 memory_manager.go:354] "RemoveStaleState removing state" podUID="7a0a6fdd-7298-421f-b1c0-c59c88cc542d" containerName="gather" Nov 28 14:03:42 crc kubenswrapper[4970]: I1128 14:03:42.387551 4970 memory_manager.go:354] "RemoveStaleState removing state" podUID="46768012-175b-4fa6-9272-397577409a24" containerName="collect-profiles" Nov 28 14:03:42 crc kubenswrapper[4970]: I1128 14:03:42.387563 4970 memory_manager.go:354] "RemoveStaleState removing state" podUID="7a0a6fdd-7298-421f-b1c0-c59c88cc542d" containerName="copy" Nov 28 14:03:42 crc kubenswrapper[4970]: I1128 14:03:42.388774 4970 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-s7ppz" Nov 28 14:03:42 crc kubenswrapper[4970]: I1128 14:03:42.398518 4970 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-s7ppz"] Nov 28 14:03:42 crc kubenswrapper[4970]: I1128 14:03:42.421330 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/509a96e7-ef38-4857-9ca0-29c5d17cca36-catalog-content\") pod \"community-operators-s7ppz\" (UID: \"509a96e7-ef38-4857-9ca0-29c5d17cca36\") " pod="openshift-marketplace/community-operators-s7ppz" Nov 28 14:03:42 crc kubenswrapper[4970]: I1128 14:03:42.421455 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v2jn6\" (UniqueName: \"kubernetes.io/projected/509a96e7-ef38-4857-9ca0-29c5d17cca36-kube-api-access-v2jn6\") pod \"community-operators-s7ppz\" (UID: \"509a96e7-ef38-4857-9ca0-29c5d17cca36\") " pod="openshift-marketplace/community-operators-s7ppz" Nov 28 14:03:42 crc kubenswrapper[4970]: I1128 14:03:42.421661 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/509a96e7-ef38-4857-9ca0-29c5d17cca36-utilities\") pod \"community-operators-s7ppz\" (UID: \"509a96e7-ef38-4857-9ca0-29c5d17cca36\") " pod="openshift-marketplace/community-operators-s7ppz" Nov 28 14:03:42 crc kubenswrapper[4970]: I1128 14:03:42.522433 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/509a96e7-ef38-4857-9ca0-29c5d17cca36-catalog-content\") pod \"community-operators-s7ppz\" (UID: \"509a96e7-ef38-4857-9ca0-29c5d17cca36\") " pod="openshift-marketplace/community-operators-s7ppz" Nov 28 14:03:42 crc kubenswrapper[4970]: I1128 14:03:42.522491 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v2jn6\" 
(UniqueName: \"kubernetes.io/projected/509a96e7-ef38-4857-9ca0-29c5d17cca36-kube-api-access-v2jn6\") pod \"community-operators-s7ppz\" (UID: \"509a96e7-ef38-4857-9ca0-29c5d17cca36\") " pod="openshift-marketplace/community-operators-s7ppz" Nov 28 14:03:42 crc kubenswrapper[4970]: I1128 14:03:42.522545 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/509a96e7-ef38-4857-9ca0-29c5d17cca36-utilities\") pod \"community-operators-s7ppz\" (UID: \"509a96e7-ef38-4857-9ca0-29c5d17cca36\") " pod="openshift-marketplace/community-operators-s7ppz" Nov 28 14:03:42 crc kubenswrapper[4970]: I1128 14:03:42.523068 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/509a96e7-ef38-4857-9ca0-29c5d17cca36-catalog-content\") pod \"community-operators-s7ppz\" (UID: \"509a96e7-ef38-4857-9ca0-29c5d17cca36\") " pod="openshift-marketplace/community-operators-s7ppz" Nov 28 14:03:42 crc kubenswrapper[4970]: I1128 14:03:42.523183 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/509a96e7-ef38-4857-9ca0-29c5d17cca36-utilities\") pod \"community-operators-s7ppz\" (UID: \"509a96e7-ef38-4857-9ca0-29c5d17cca36\") " pod="openshift-marketplace/community-operators-s7ppz" Nov 28 14:03:42 crc kubenswrapper[4970]: I1128 14:03:42.556203 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v2jn6\" (UniqueName: \"kubernetes.io/projected/509a96e7-ef38-4857-9ca0-29c5d17cca36-kube-api-access-v2jn6\") pod \"community-operators-s7ppz\" (UID: \"509a96e7-ef38-4857-9ca0-29c5d17cca36\") " pod="openshift-marketplace/community-operators-s7ppz" Nov 28 14:03:42 crc kubenswrapper[4970]: I1128 14:03:42.712897 4970 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-s7ppz" Nov 28 14:03:43 crc kubenswrapper[4970]: I1128 14:03:43.236270 4970 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-s7ppz"] Nov 28 14:03:43 crc kubenswrapper[4970]: W1128 14:03:43.244035 4970 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod509a96e7_ef38_4857_9ca0_29c5d17cca36.slice/crio-ba80d66b198ce8e6a73daebb1d9f27cd65f0fc9868a5233fc7abd4187353ec63 WatchSource:0}: Error finding container ba80d66b198ce8e6a73daebb1d9f27cd65f0fc9868a5233fc7abd4187353ec63: Status 404 returned error can't find the container with id ba80d66b198ce8e6a73daebb1d9f27cd65f0fc9868a5233fc7abd4187353ec63 Nov 28 14:03:44 crc kubenswrapper[4970]: I1128 14:03:44.121361 4970 generic.go:334] "Generic (PLEG): container finished" podID="509a96e7-ef38-4857-9ca0-29c5d17cca36" containerID="e44f2767a4c83c36493074bd2b63551bf744bcfd5ca91323f094e934aaa9433f" exitCode=0 Nov 28 14:03:44 crc kubenswrapper[4970]: I1128 14:03:44.121408 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-s7ppz" event={"ID":"509a96e7-ef38-4857-9ca0-29c5d17cca36","Type":"ContainerDied","Data":"e44f2767a4c83c36493074bd2b63551bf744bcfd5ca91323f094e934aaa9433f"} Nov 28 14:03:44 crc kubenswrapper[4970]: I1128 14:03:44.121434 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-s7ppz" event={"ID":"509a96e7-ef38-4857-9ca0-29c5d17cca36","Type":"ContainerStarted","Data":"ba80d66b198ce8e6a73daebb1d9f27cd65f0fc9868a5233fc7abd4187353ec63"} Nov 28 14:03:44 crc kubenswrapper[4970]: I1128 14:03:44.123295 4970 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Nov 28 14:03:47 crc kubenswrapper[4970]: I1128 14:03:47.201247 4970 generic.go:334] "Generic (PLEG): container finished" podID="509a96e7-ef38-4857-9ca0-29c5d17cca36" containerID="e5c90844aa831ca30afc54289d24b6359a616706aa1a2245e7a3cd6bd242dbfb" exitCode=0 Nov 28 14:03:47 crc kubenswrapper[4970]: I1128 14:03:47.201501 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-s7ppz" event={"ID":"509a96e7-ef38-4857-9ca0-29c5d17cca36","Type":"ContainerDied","Data":"e5c90844aa831ca30afc54289d24b6359a616706aa1a2245e7a3cd6bd242dbfb"} Nov 28 14:03:47 crc kubenswrapper[4970]: I1128 14:03:47.751586 4970 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-lcjph"] Nov 28 14:03:47 crc kubenswrapper[4970]: I1128 14:03:47.753146 4970 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-lcjph" Nov 28 14:03:47 crc kubenswrapper[4970]: I1128 14:03:47.776064 4970 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-lcjph"] Nov 28 14:03:47 crc kubenswrapper[4970]: I1128 14:03:47.826845 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a2c5578d-3196-4005-834a-bba0543c0269-catalog-content\") pod \"redhat-operators-lcjph\" (UID: \"a2c5578d-3196-4005-834a-bba0543c0269\") " pod="openshift-marketplace/redhat-operators-lcjph" Nov 28 14:03:47 crc kubenswrapper[4970]: I1128 14:03:47.826890 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wzpqv\" (UniqueName: \"kubernetes.io/projected/a2c5578d-3196-4005-834a-bba0543c0269-kube-api-access-wzpqv\") pod \"redhat-operators-lcjph\" (UID: \"a2c5578d-3196-4005-834a-bba0543c0269\") " pod="openshift-marketplace/redhat-operators-lcjph" Nov 28 14:03:47 crc kubenswrapper[4970]: I1128 14:03:47.826961 4970 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a2c5578d-3196-4005-834a-bba0543c0269-utilities\") pod \"redhat-operators-lcjph\" (UID: \"a2c5578d-3196-4005-834a-bba0543c0269\") " pod="openshift-marketplace/redhat-operators-lcjph" Nov 28 14:03:47 crc kubenswrapper[4970]: I1128 14:03:47.927795 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a2c5578d-3196-4005-834a-bba0543c0269-catalog-content\") pod \"redhat-operators-lcjph\" (UID: \"a2c5578d-3196-4005-834a-bba0543c0269\") " pod="openshift-marketplace/redhat-operators-lcjph" Nov 28 14:03:47 crc kubenswrapper[4970]: I1128 14:03:47.927839 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wzpqv\" (UniqueName: \"kubernetes.io/projected/a2c5578d-3196-4005-834a-bba0543c0269-kube-api-access-wzpqv\") pod \"redhat-operators-lcjph\" (UID: \"a2c5578d-3196-4005-834a-bba0543c0269\") " pod="openshift-marketplace/redhat-operators-lcjph" Nov 28 14:03:47 crc kubenswrapper[4970]: I1128 14:03:47.927881 4970 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a2c5578d-3196-4005-834a-bba0543c0269-utilities\") pod \"redhat-operators-lcjph\" (UID: \"a2c5578d-3196-4005-834a-bba0543c0269\") " pod="openshift-marketplace/redhat-operators-lcjph" Nov 28 14:03:47 crc kubenswrapper[4970]: I1128 14:03:47.928378 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a2c5578d-3196-4005-834a-bba0543c0269-catalog-content\") pod \"redhat-operators-lcjph\" (UID: \"a2c5578d-3196-4005-834a-bba0543c0269\") " pod="openshift-marketplace/redhat-operators-lcjph" Nov 28 14:03:47 crc kubenswrapper[4970]: I1128 14:03:47.928412 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a2c5578d-3196-4005-834a-bba0543c0269-utilities\") pod \"redhat-operators-lcjph\" (UID: \"a2c5578d-3196-4005-834a-bba0543c0269\") " pod="openshift-marketplace/redhat-operators-lcjph" Nov 28 14:03:47 crc kubenswrapper[4970]: I1128 14:03:47.950111 4970 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-wzpqv\" (UniqueName: \"kubernetes.io/projected/a2c5578d-3196-4005-834a-bba0543c0269-kube-api-access-wzpqv\") pod \"redhat-operators-lcjph\" (UID: \"a2c5578d-3196-4005-834a-bba0543c0269\") " pod="openshift-marketplace/redhat-operators-lcjph" Nov 28 14:03:48 crc kubenswrapper[4970]: I1128 14:03:48.068400 4970 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-lcjph" Nov 28 14:03:48 crc kubenswrapper[4970]: I1128 14:03:48.215074 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-s7ppz" event={"ID":"509a96e7-ef38-4857-9ca0-29c5d17cca36","Type":"ContainerStarted","Data":"bc7c486ef7eb6c528896420f20e9b1a43ec7d74b1c15b716d0a22a7c0f36e3be"} Nov 28 14:03:48 crc kubenswrapper[4970]: I1128 14:03:48.234824 4970 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-s7ppz" podStartSLOduration=2.420823544 podStartE2EDuration="6.234809974s" podCreationTimestamp="2025-11-28 14:03:42 +0000 UTC" firstStartedPulling="2025-11-28 14:03:44.123077403 +0000 UTC m=+2634.975959203" lastFinishedPulling="2025-11-28 14:03:47.937063833 +0000 UTC m=+2638.789945633" observedRunningTime="2025-11-28 14:03:48.234022872 +0000 UTC m=+2639.086904672" watchObservedRunningTime="2025-11-28 14:03:48.234809974 +0000 UTC m=+2639.087691784" Nov 28 14:03:48 crc kubenswrapper[4970]: I1128 14:03:48.497720 4970 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-lcjph"] Nov 28 14:03:48 crc kubenswrapper[4970]: W1128 14:03:48.505543 4970 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda2c5578d_3196_4005_834a_bba0543c0269.slice/crio-4012e74433cafd478a4a253920ab7ba6c0fe9b27813b94a84e536ef06f16710d WatchSource:0}: Error finding container 4012e74433cafd478a4a253920ab7ba6c0fe9b27813b94a84e536ef06f16710d: Status 404 returned error can't find the container with id 4012e74433cafd478a4a253920ab7ba6c0fe9b27813b94a84e536ef06f16710d Nov 28 14:03:49 crc kubenswrapper[4970]: I1128 14:03:49.222254 4970 generic.go:334] "Generic (PLEG): container finished" podID="a2c5578d-3196-4005-834a-bba0543c0269" containerID="b068ca2f92a9b316234cc0cd37824bada860dd228d7843d53a49eaab2c8fee2f" exitCode=0 Nov 28 14:03:49 crc kubenswrapper[4970]: I1128 14:03:49.222369 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lcjph" event={"ID":"a2c5578d-3196-4005-834a-bba0543c0269","Type":"ContainerDied","Data":"b068ca2f92a9b316234cc0cd37824bada860dd228d7843d53a49eaab2c8fee2f"} Nov 28 14:03:49 crc kubenswrapper[4970]: I1128 14:03:49.222538 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lcjph" event={"ID":"a2c5578d-3196-4005-834a-bba0543c0269","Type":"ContainerStarted","Data":"4012e74433cafd478a4a253920ab7ba6c0fe9b27813b94a84e536ef06f16710d"} Nov 28 14:03:51 crc kubenswrapper[4970]: I1128 14:03:51.235816 4970 generic.go:334] "Generic (PLEG): container finished" podID="a2c5578d-3196-4005-834a-bba0543c0269" containerID="bd648b05f65d34515fdb19a91804c8d72c89d0ed4879f189a3cc4e155668b254" exitCode=0 Nov 28 14:03:51 crc kubenswrapper[4970]: I1128 14:03:51.235942 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lcjph" 
event={"ID":"a2c5578d-3196-4005-834a-bba0543c0269","Type":"ContainerDied","Data":"bd648b05f65d34515fdb19a91804c8d72c89d0ed4879f189a3cc4e155668b254"} Nov 28 14:03:52 crc kubenswrapper[4970]: I1128 14:03:52.244202 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lcjph" event={"ID":"a2c5578d-3196-4005-834a-bba0543c0269","Type":"ContainerStarted","Data":"8850942352ec77d25587aafdd8199b21febe570748f2e2e17dd29438c48e7a7a"} Nov 28 14:03:52 crc kubenswrapper[4970]: I1128 14:03:52.275761 4970 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-lcjph" podStartSLOduration=2.824991402 podStartE2EDuration="5.275739203s" podCreationTimestamp="2025-11-28 14:03:47 +0000 UTC" firstStartedPulling="2025-11-28 14:03:49.225404924 +0000 UTC m=+2640.078286724" lastFinishedPulling="2025-11-28 14:03:51.676152725 +0000 UTC m=+2642.529034525" observedRunningTime="2025-11-28 14:03:52.269873997 +0000 UTC m=+2643.122755837" watchObservedRunningTime="2025-11-28 14:03:52.275739203 +0000 UTC m=+2643.128621003" Nov 28 14:03:52 crc kubenswrapper[4970]: I1128 14:03:52.713575 4970 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-s7ppz" Nov 28 14:03:52 crc kubenswrapper[4970]: I1128 14:03:52.713680 4970 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-s7ppz" Nov 28 14:03:52 crc kubenswrapper[4970]: I1128 14:03:52.764582 4970 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-s7ppz" Nov 28 14:03:53 crc kubenswrapper[4970]: I1128 14:03:53.294266 4970 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-s7ppz" Nov 28 14:03:54 crc kubenswrapper[4970]: I1128 14:03:54.945877 4970 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-s7ppz"] Nov 28 14:03:55 crc kubenswrapper[4970]: I1128 14:03:55.257189 4970 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-s7ppz" podUID="509a96e7-ef38-4857-9ca0-29c5d17cca36" containerName="registry-server" containerID="cri-o://bc7c486ef7eb6c528896420f20e9b1a43ec7d74b1c15b716d0a22a7c0f36e3be" gracePeriod=2 Nov 28 14:03:56 crc kubenswrapper[4970]: I1128 14:03:56.264720 4970 generic.go:334] "Generic (PLEG): container finished" podID="509a96e7-ef38-4857-9ca0-29c5d17cca36" containerID="bc7c486ef7eb6c528896420f20e9b1a43ec7d74b1c15b716d0a22a7c0f36e3be" exitCode=0 Nov 28 14:03:56 crc kubenswrapper[4970]: I1128 14:03:56.264766 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-s7ppz" event={"ID":"509a96e7-ef38-4857-9ca0-29c5d17cca36","Type":"ContainerDied","Data":"bc7c486ef7eb6c528896420f20e9b1a43ec7d74b1c15b716d0a22a7c0f36e3be"} Nov 28 14:03:56 crc kubenswrapper[4970]: I1128 14:03:56.634796 4970 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-s7ppz" Nov 28 14:03:56 crc kubenswrapper[4970]: I1128 14:03:56.657286 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/509a96e7-ef38-4857-9ca0-29c5d17cca36-utilities\") pod \"509a96e7-ef38-4857-9ca0-29c5d17cca36\" (UID: \"509a96e7-ef38-4857-9ca0-29c5d17cca36\") " Nov 28 14:03:56 crc kubenswrapper[4970]: I1128 14:03:56.657371 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/509a96e7-ef38-4857-9ca0-29c5d17cca36-catalog-content\") pod \"509a96e7-ef38-4857-9ca0-29c5d17cca36\" (UID: \"509a96e7-ef38-4857-9ca0-29c5d17cca36\") " Nov 28 14:03:56 crc kubenswrapper[4970]: I1128 14:03:56.657409 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v2jn6\" (UniqueName: \"kubernetes.io/projected/509a96e7-ef38-4857-9ca0-29c5d17cca36-kube-api-access-v2jn6\") pod \"509a96e7-ef38-4857-9ca0-29c5d17cca36\" (UID: \"509a96e7-ef38-4857-9ca0-29c5d17cca36\") " Nov 28 14:03:56 crc kubenswrapper[4970]: I1128 14:03:56.658448 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/509a96e7-ef38-4857-9ca0-29c5d17cca36-utilities" (OuterVolumeSpecName: "utilities") pod "509a96e7-ef38-4857-9ca0-29c5d17cca36" (UID: "509a96e7-ef38-4857-9ca0-29c5d17cca36"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 28 14:03:56 crc kubenswrapper[4970]: I1128 14:03:56.664079 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/509a96e7-ef38-4857-9ca0-29c5d17cca36-kube-api-access-v2jn6" (OuterVolumeSpecName: "kube-api-access-v2jn6") pod "509a96e7-ef38-4857-9ca0-29c5d17cca36" (UID: "509a96e7-ef38-4857-9ca0-29c5d17cca36"). InnerVolumeSpecName "kube-api-access-v2jn6". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 28 14:03:56 crc kubenswrapper[4970]: I1128 14:03:56.716411 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/509a96e7-ef38-4857-9ca0-29c5d17cca36-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "509a96e7-ef38-4857-9ca0-29c5d17cca36" (UID: "509a96e7-ef38-4857-9ca0-29c5d17cca36"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 28 14:03:56 crc kubenswrapper[4970]: I1128 14:03:56.759343 4970 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/509a96e7-ef38-4857-9ca0-29c5d17cca36-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 28 14:03:56 crc kubenswrapper[4970]: I1128 14:03:56.759388 4970 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v2jn6\" (UniqueName: \"kubernetes.io/projected/509a96e7-ef38-4857-9ca0-29c5d17cca36-kube-api-access-v2jn6\") on node \"crc\" DevicePath \"\"" Nov 28 14:03:56 crc kubenswrapper[4970]: I1128 14:03:56.759400 4970 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/509a96e7-ef38-4857-9ca0-29c5d17cca36-utilities\") on node \"crc\" DevicePath \"\"" Nov 28 14:03:57 crc kubenswrapper[4970]: I1128 14:03:57.273186 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-s7ppz" event={"ID":"509a96e7-ef38-4857-9ca0-29c5d17cca36","Type":"ContainerDied","Data":"ba80d66b198ce8e6a73daebb1d9f27cd65f0fc9868a5233fc7abd4187353ec63"} Nov 28 14:03:57 crc kubenswrapper[4970]: I1128 14:03:57.273270 4970 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-s7ppz" Nov 28 14:03:57 crc kubenswrapper[4970]: I1128 14:03:57.273273 4970 scope.go:117] "RemoveContainer" containerID="bc7c486ef7eb6c528896420f20e9b1a43ec7d74b1c15b716d0a22a7c0f36e3be" Nov 28 14:03:57 crc kubenswrapper[4970]: I1128 14:03:57.294078 4970 scope.go:117] "RemoveContainer" containerID="e5c90844aa831ca30afc54289d24b6359a616706aa1a2245e7a3cd6bd242dbfb" Nov 28 14:03:57 crc kubenswrapper[4970]: I1128 14:03:57.302464 4970 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-s7ppz"] Nov 28 14:03:57 crc kubenswrapper[4970]: I1128 14:03:57.306589 4970 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-s7ppz"] Nov 28 14:03:57 crc kubenswrapper[4970]: I1128 14:03:57.332584 4970 scope.go:117] "RemoveContainer" containerID="e44f2767a4c83c36493074bd2b63551bf744bcfd5ca91323f094e934aaa9433f" Nov 28 14:03:57 crc kubenswrapper[4970]: I1128 14:03:57.387658 4970 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="509a96e7-ef38-4857-9ca0-29c5d17cca36" path="/var/lib/kubelet/pods/509a96e7-ef38-4857-9ca0-29c5d17cca36/volumes" Nov 28 14:03:58 crc kubenswrapper[4970]: I1128 14:03:58.069393 4970 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-lcjph" Nov 28 14:03:58 crc kubenswrapper[4970]: I1128 14:03:58.069461 4970 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-lcjph" Nov 28 14:03:58 crc kubenswrapper[4970]: I1128 14:03:58.113102 4970 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-lcjph" Nov 28 14:03:58 crc kubenswrapper[4970]: I1128 14:03:58.320867 4970 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-lcjph" Nov 28 14:04:00 crc kubenswrapper[4970]: I1128 14:04:00.349452 4970 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-lcjph"] Nov 28 14:04:00 crc kubenswrapper[4970]: I1128 14:04:00.350492 4970 kuberuntime_container.go:808] "Killing container 
with a grace period" pod="openshift-marketplace/redhat-operators-lcjph" podUID="a2c5578d-3196-4005-834a-bba0543c0269" containerName="registry-server" containerID="cri-o://8850942352ec77d25587aafdd8199b21febe570748f2e2e17dd29438c48e7a7a" gracePeriod=2 Nov 28 14:04:04 crc kubenswrapper[4970]: I1128 14:04:04.826064 4970 generic.go:334] "Generic (PLEG): container finished" podID="a2c5578d-3196-4005-834a-bba0543c0269" containerID="8850942352ec77d25587aafdd8199b21febe570748f2e2e17dd29438c48e7a7a" exitCode=0 Nov 28 14:04:04 crc kubenswrapper[4970]: I1128 14:04:04.826171 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lcjph" event={"ID":"a2c5578d-3196-4005-834a-bba0543c0269","Type":"ContainerDied","Data":"8850942352ec77d25587aafdd8199b21febe570748f2e2e17dd29438c48e7a7a"} Nov 28 14:04:05 crc kubenswrapper[4970]: I1128 14:04:05.022044 4970 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-lcjph" Nov 28 14:04:05 crc kubenswrapper[4970]: I1128 14:04:05.104827 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a2c5578d-3196-4005-834a-bba0543c0269-catalog-content\") pod \"a2c5578d-3196-4005-834a-bba0543c0269\" (UID: \"a2c5578d-3196-4005-834a-bba0543c0269\") " Nov 28 14:04:05 crc kubenswrapper[4970]: I1128 14:04:05.105297 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wzpqv\" (UniqueName: \"kubernetes.io/projected/a2c5578d-3196-4005-834a-bba0543c0269-kube-api-access-wzpqv\") pod \"a2c5578d-3196-4005-834a-bba0543c0269\" (UID: \"a2c5578d-3196-4005-834a-bba0543c0269\") " Nov 28 14:04:05 crc kubenswrapper[4970]: I1128 14:04:05.105418 4970 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a2c5578d-3196-4005-834a-bba0543c0269-utilities\") pod \"a2c5578d-3196-4005-834a-bba0543c0269\" (UID: \"a2c5578d-3196-4005-834a-bba0543c0269\") " Nov 28 14:04:05 crc kubenswrapper[4970]: I1128 14:04:05.106613 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a2c5578d-3196-4005-834a-bba0543c0269-utilities" (OuterVolumeSpecName: "utilities") pod "a2c5578d-3196-4005-834a-bba0543c0269" (UID: "a2c5578d-3196-4005-834a-bba0543c0269"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 28 14:04:05 crc kubenswrapper[4970]: I1128 14:04:05.112771 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a2c5578d-3196-4005-834a-bba0543c0269-kube-api-access-wzpqv" (OuterVolumeSpecName: "kube-api-access-wzpqv") pod "a2c5578d-3196-4005-834a-bba0543c0269" (UID: "a2c5578d-3196-4005-834a-bba0543c0269"). InnerVolumeSpecName "kube-api-access-wzpqv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 28 14:04:05 crc kubenswrapper[4970]: I1128 14:04:05.206946 4970 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a2c5578d-3196-4005-834a-bba0543c0269-utilities\") on node \"crc\" DevicePath \"\"" Nov 28 14:04:05 crc kubenswrapper[4970]: I1128 14:04:05.207013 4970 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wzpqv\" (UniqueName: \"kubernetes.io/projected/a2c5578d-3196-4005-834a-bba0543c0269-kube-api-access-wzpqv\") on node \"crc\" DevicePath \"\"" Nov 28 14:04:05 crc kubenswrapper[4970]: I1128 14:04:05.226942 4970 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a2c5578d-3196-4005-834a-bba0543c0269-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a2c5578d-3196-4005-834a-bba0543c0269" (UID: "a2c5578d-3196-4005-834a-bba0543c0269"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 28 14:04:05 crc kubenswrapper[4970]: I1128 14:04:05.308514 4970 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a2c5578d-3196-4005-834a-bba0543c0269-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 28 14:04:05 crc kubenswrapper[4970]: I1128 14:04:05.834641 4970 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lcjph" event={"ID":"a2c5578d-3196-4005-834a-bba0543c0269","Type":"ContainerDied","Data":"4012e74433cafd478a4a253920ab7ba6c0fe9b27813b94a84e536ef06f16710d"} Nov 28 14:04:05 crc kubenswrapper[4970]: I1128 14:04:05.834703 4970 scope.go:117] "RemoveContainer" containerID="8850942352ec77d25587aafdd8199b21febe570748f2e2e17dd29438c48e7a7a" Nov 28 14:04:05 crc kubenswrapper[4970]: I1128 14:04:05.834705 4970 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-lcjph" Nov 28 14:04:05 crc kubenswrapper[4970]: I1128 14:04:05.848700 4970 scope.go:117] "RemoveContainer" containerID="bd648b05f65d34515fdb19a91804c8d72c89d0ed4879f189a3cc4e155668b254" Nov 28 14:04:05 crc kubenswrapper[4970]: I1128 14:04:05.853991 4970 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-lcjph"] Nov 28 14:04:05 crc kubenswrapper[4970]: I1128 14:04:05.859735 4970 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-lcjph"] Nov 28 14:04:05 crc kubenswrapper[4970]: I1128 14:04:05.866066 4970 scope.go:117] "RemoveContainer" containerID="b068ca2f92a9b316234cc0cd37824bada860dd228d7843d53a49eaab2c8fee2f" Nov 28 14:04:07 crc kubenswrapper[4970]: I1128 14:04:07.389064 4970 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a2c5578d-3196-4005-834a-bba0543c0269" path="/var/lib/kubelet/pods/a2c5578d-3196-4005-834a-bba0543c0269/volumes" Nov 28 14:04:21 crc kubenswrapper[4970]: I1128 14:04:21.333855 4970 patch_prober.go:28] interesting pod/machine-config-daemon-tjrng container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 28 14:04:21 crc kubenswrapper[4970]: I1128 14:04:21.334426 4970 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-tjrng" podUID="70bedd43-c527-436e-b47b-0b9ec5b10601" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
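
The kubenswrapper entries above follow kubelet's klog format: a severity letter and date (e.g. E1128), a timestamp, the emitting source file and line (pod_workers.go:1301, reconciler_common.go:245, ...), a quoted structured message, and key="value" fields such as pod= and podUID=. A minimal sketch for tallying those structured messages per pod when triaging an excerpt like this one, assuming the journal has been saved to a plain-text file with one record per line (e.g. via journalctl -u kubelet); the path "kubelet-journal.log" and the top-10 cutoff are illustrative, not part of the captured log:

```python
# Sketch: count (message, pod) pairs in a saved kubelet journal excerpt.
import re
from collections import Counter

# klog prefix as emitted by kubenswrapper, e.g.
# '... kubenswrapper[4970]: E1128 14:00:24.380932 4970 pod_workers.go:1301] "Error syncing pod, skipping" ...'
LINE_RE = re.compile(
    r'kubenswrapper\[\d+\]: '
    r'(?P<level>[IWE])\d{4} (?P<time>\d{2}:\d{2}:\d{2}\.\d+) '
    r'\d+ (?P<source>[\w.]+:\d+)\] "(?P<msg>[^"]*)"(?P<rest>.*)'
)
POD_RE = re.compile(r'pod="(?P<pod>[^"]+)"')


def summarize(path: str) -> Counter:
    """Tally structured kubelet messages, grouped by the pod= field when present."""
    counts: Counter = Counter()
    with open(path, encoding="utf-8") as fh:
        for line in fh:
            m = LINE_RE.search(line)
            if not m:
                # Unstructured entries (e.g. cadvisor watch warnings) are skipped.
                continue
            pod_match = POD_RE.search(m.group("rest"))
            pod = pod_match.group("pod") if pod_match else "-"
            counts[(m.group("msg"), pod)] += 1
    return counts


if __name__ == "__main__":
    for (msg, pod), n in summarize("kubelet-journal.log").most_common(10):
        print(f"{n:4d}  {msg}  {pod}")
```

Run against this excerpt, such a tally should surface the recurring "Error syncing pod, skipping" / CrashLoopBackOff entries for openshift-machine-config-operator/machine-config-daemon-tjrng alongside the routine "Finished parsing log file" and volume mount/unmount traffic shown above.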